[Binary content: this file is a tar (ustar) archive of Zuul CI job output, not a text document. Recoverable member listing from the tar headers:

var/home/core/zuul-output/                      directory, mode 0755, owner core:core
var/home/core/zuul-output/logs/                 directory, mode 0755, owner core:core
var/home/core/zuul-output/logs/kubelet.log.gz   file, mode 0644, owner core:core

The remainder of the archive is the gzip-compressed body of kubelet.log and is not recoverable as text.]
B;1OyBn#:wg@G3 ޵_睗PJpUc>A8z|h7\o}I}E <Qp 2XbF)㉤F)qT&ba[ĊD Nӗ^=Y9 1 vrQ {t@T) FsK*X"i9:͈I{t?G|IWԪ5i|#T3z_]Wb%˟za>RH;3wºTU$XP-/_F&1,%qdJ??-(6Bßׅ5ǭǟ|ޫvkf&˖VZ_F3:ҎmPZ0N]U/f7ew7 *9tnbnHw4[i`il%^ pœmlu1ՉS2v>DXq+UOY8NS&K5QiP|Q'?}cE ȃ+Qm-^Lɖ|NzNB0jHC|#anX4I+l?9J{,'7yEAcWHHBȖb1p,{w{{u&(OZ3LKfP`'[Y4\ik0wc}pyZnʟ#jL"Ǐߑ.W`7Y3|UK_˒9^_ØoJ_52$ hy ^Ük^|hCw>إ B4U|)[eI+l y$EMVg*hFYd;Ŕ[x(wNhByϡژ ܓdcVUamgc=vib^ qEfw\4 KzhEWqlN l9!5OeJdBlJelF ā!!ȕ#fwX(cD%ĒbI=&*~eVٯ7̠_S/d0͚9clr4 fY769K8XoNhMgGӅvQeoG~Ч=^cMP5FW`$^p᧥قTj De r\Mڎ'_N݃{OqFG@hҳv˯<[W!k30SՌ?z8[) 6 n( :oB-p5%U() %'bwIKfՈ vQ ECkZck>n< g-:jX=kh2Вa.{ǍOs@δ\L艘/}q|ˉ~by&NYOSc#AsUN kRUYT>ʙ.k,D*?3S+C%>P{3eKStxY-^!ƛqyulIQt~7fg8dg|`xsaUp<8)><8)Ƀc+P\xwuoŹ(%2[p0ڽE5]DqCoyߊ Ȯe)9@a T)ZjRe!rc&ɔڥ]0 X5 j˪vs97lPXӮ++;b ʦiT2kf̥OA |Zbu{l4%K,?#zbRY [D\UfMLC(]cX$U*tRi0$% Hcė \`~˱LEO1d*~z-*u {X丂Zna~8X]! T]=Av躩;Įvh3®+V1]!ZU)+U7$mWTm`a{qv|jW&H>.^'Ky4xc8yņC߆Qo6 4[gڡ߸8:c; ;# F^x%nׂik9u` ;ڸ:1Q{i|F Z?mzŜwo~H;^߈'ޕ?;_6};hpá$p| ++ⷎWLv*$N7Vw/ [l|W?w3~⣏4\I nTO=/']j' Z]'za\2'"Hڴy;$#|iM7x1gQUx療M닋&eL1m-awf$nPA}_/“2: M+jLO|=Dl@\ ̈}SKڱ[ݰ TREu 5.\I5k&zQP^F*MWN (aYߙ4軲FlSXZygTyN ϳ&*%죨ZD7LǑeQ!€8"S "ڊ٧°Fex &ݴ)T1ӝ2Gb-heO4la D':MSW{ʇw~^C{|4`B.NZf-JFѻv!ކ۲Go?IhW5N&l:x;v+S1Vu`+&=9}}遰:վt6;Pч<<8Ւ1qϿ#s;!GW%aK J dbQ) o=6e*]}rr߻[4.Uan9s"&|JNЇ eLYJ:Ay  e;W՘ڒ-m Dk㘒]9h) 0x15$c:6ST8;[`Y4zȉNKOzK.TEE0D1 "]٢"ae& mv 1UK*˰V.( Iv`*Hzr:O6=hA=&J- fh|ED2 M2X$͞`@̩C5GAn) nBFry .EK_?4LxT!hS ۪TFMBDYA bZ*.m\v*a>ڤKft2HSO@tʂ(~.' `|8~u/{m ܫ '8]/sW>c}gjm-C&qћnX! 
0%b 2E<+־V_B/hG D;ht?י98xxwr $S>~ft^p”؏wk@|`ħ6RiҞ?nj+#ZDmܒScdlv}rY|֬4eX$LD|UL[;]Gmu=r|֎6bQCE$k9^xŽ9Yo}`oK/s!H;W@aInG.\ N!o.oLQj(/R[E֒>'4ߞ]=sgo5ٞB{ci mz*9y<>y &>;׆|0i>0ycN3;;ώ_cjjp[>wܡ0<:exF޶}zkϞ|vtЍj5oL8?}eV-豾6nh #?g|cFdv-_t*;TVz{-zjՍz}.M UIsnP[Mm}jNΗs;ZxfVO S}m`^u Y jF efgNK2ÎLi¹ U&gvR dwޚbQ^;~LY6۶`O`ƀ ]+ǔh-5hyv_6zkDŽl@1Yd"ۆI7w+(3UUB zU^d{0"U d-p1SC}Y5h*܁t%|6f_]g̏4;m2vEs=租Z)|sUcܙ(Gpi=$Uɳ^6K3o`NΫ/MۯSp/ԹuW6.)Ra$!wכLBZ~]Ά})|w5`'paI< XŨ_I&ٴ5HѨcەC<4] dqPV׺}}2Пs~]#0RӁ(l8#7QZ}gQyP=yA9' Ħ ndy^4'ӻ;l]q/ⴝJV3*.(۽늭nR]n`!6 X\-/LylK!TEEj8'3ڕ4Cs@\^pw#%KdLҍ#0Q 8F* "= )W L͜#c"-b"@\ԣhZ ` $-ü^Q.O NdD]̘ Wp]/OA'H8~@d#saY,} =_>TxZ~D;hfeA3A3k@g pƘ []lDe 3ME 6:n`#`x۱gsۗBa}_l8xfJcny`v_]5ך1f EZ+A/BDp)1cx,8Q h(03mٿW(ѱ$ۇdP/ۅ-_"B-ƏJޒ{ұMQMc6/Q(kİdR!5!1C:p{k?"L7w`T3 +;A!VdbA @d0V*cg*| Tq$Z;ExE 94URw ~>IɏUarx5(# .Ic+B`PR"(Q!T($$Ա'CVs=첟 ,@TLSV:aRړk$PعIluybidH"< TE1 Bc#HB ɈLJ0g1 }""+Z6HZ#i-a 3^K叕kYҫz!I%ϩKJWvHqFަFnBuSlz Jaw-LuS(x?Qࢦ_^'oO={~zsЗdLB_1в4 U>Z=_}r5d9[k)JX+INoP11튳sĭ$MIRoӒټLvjOͦ6qyG_{,T.jEA^zkY|@ӵE}&ӓ7O=~W'ϏߤfʨƷdf|]yyT}P ek£ !!3p4LB7\b`-H&c<+>Xn`-yMw,"5W}g,Rk .ΊK4x0|Y׷̄suo!*]t_#22h U%4ZK/”0 %}SD}cx 1R.. fn@gK7eFy5P dt0n6^W--M\|(Whww`餧)K'UH0uno/M#T}qLh<4T }B#AU.pʁ~"* _tӦ0S=Ȓ9q9P_-o,{UV>Or1rGٵ- O 4l`R%.f}?{wgjjw6Zu/:4qq~ڝa Ђ0Z!t\vnS#34Vѧ8V'Fsf0iêx:Y RuzE dn~ ӇGQvnLʝTNE,3p]Af M n+#! 
卶Qokzng0g{Wj]5WrM_yZ%T [9?&tY%t>=Q(g?yT`<‡7LBU'89+=.Ѧ@sO)Y ibD$y08 mj)cUfD>q@雇:|")<+1 `i+J̽2>w)v1C__4ےm3";:pn**'9nvA.G?lmm͊m8wҤdrYu\m|A4uGW ZZ0Li2˼Lfer25UGJ#2 ղN|UX'/t0ez聞@O;*5[f5κm Es0Ѽ9ݡ%Sm\{20[=Lz`lcunCy0,VN87'Mkr7 އ(1,X(tK VZs#4.!Lv&;Kk{֐1CͲ@AAnŽ!W9+ύyE4Ŝ&Ik4&ogʶQ;spߕA'TZS3T\u;Fd`Nnf|gϞ<9>;ypW})ݠy_41U-`,NEsֈ87eˋ^94C@2.]ouv{Wt%Tl>kwv33s eXo*Cqڇ͒Ȫ9mk7ui` hsfeGnFC\݂[j6fH> V\;۽j9Ŋ \Q.[ )erT䳚KkJP]4+^qb2Eh[q=Y41 f 6V ٤Ĉ3\2WP4u6" vsDZ/5:]m^F~"D4=rtGiAi{Wf-$V(}gӲlKUW/DŽdksE~PYC%}1}, PlJJ Wo+ҷ:sJeF+R zPUO`,yu> BS!`;HjMBP` FED|&0aswɹsթ9%S++n܂#י)Moz2\]*՗Oz/^];j.USSIJX` (Jx"B `-$c'q5Y!n{@.轋qbPL}7JEVwS[2ZOи` 7`H'KAp>kENqWPXC_,輸t牸3XDER_8{("cU@|I$g>$w6Ti/_#AxF 6XK;u2Y`QS\^ve&Y :&^̫ :t'd;tn oh@m=B4}ϢI]gڏ %+Z4b[=3/c7X+F \1x8cG y&nPB !~搜TB1%( +<&9L\tGYxjx& VZms|M0UIZaLm3vҭx-MMm9-S赱et!eYߟ.km)OJ26sCA9AK 8FonĘyI93ȉ%B+,D.%mp9Ks WoAV :fuC+?-쫡8:qo5?_w[)]ag<UDPf-AHgG?Yi ~L+ºU3~ՠF5oJ;tց,6ߒIOˏ? M"ikq+{uTdWWv'{@PٯN G эs'G}KK9NY-Ǥ P}'u~޻]cayM]WnٻGn%k9#·gf4Ҵg=dU*Z*]4=mcUNpN4߃O8u^AX W+d >[XViBfWj 60lP-A[Ip{6G6QyrԾ;Y_;fre:O`+7mB\ gCRfkfn\ܕ(2x(>0GH"+!<^Pce@s!<Ԇz"tE`^u%>EC-Q;3 @CdbhAJJi6F,$4, Ш,@2&4a/0mond׈뾸 NN G~4W:0LS8R@54h 2|CDtC}$#?7hn" ̊EL]_52Y&U5f%dAwړ.?ه]so%k3fGZPYv,w̌u0ce83S C'`j!ҫDA=K fR'ixn.ۓOKb'ݑ[OU*V 5j_Sh+a^'cb2nx!UZSe6="[#c)EϦ~ҚNdu(zEE"{NǼ=Nz<}p<.l/npY_=jU$qn&ݝZLF~j]|@2Ci)p>"YT2G4jfX2G|Vº^t s*ff"H;( ybG!@ "ΈRRrޑYxjzlIg~Lr6@ZrXlwu5#1 LϾTca0,A< AH6Ξ]^B*:U2O4VIi( 8ĬHz sZG5eX(+w~%>Mlj;S1xЏ՝_|®~\^0AZ`2(YǩM#*ZI$8Bu$vӅelm6w0"p*Xrݎw/Os^l0hUk2(m O|vDң2R 0Dz 3)5P8@-c@`AP3tש *zH?+D 2k<0qH{P6_8GӃd绉_wV[ΌW}E${o?Zu: HAYς$o_klDQ%RRMTEAǭb,+>룟MU}H/K t dh.(N@#N4B~)r(s3N OqV=xwQѾUG-eWzI#LHU; 1t..,S'r"D` *Oh =9 HhDQ"Q%^DfrZ=GYХ[\`к{b,3t9H/,,\o+nZ4ŀӌ3>X!`/ *wc y6(Ș3€mUqYSFMTت,*RfW0<ѧS[x.-AjQ:QN!ךP~MsZI1F迼\Z WaP[N-J_xSM&߼)~]SC;ٗa1i.7ZLN;څQI&3x][Y[nԴE|"/E2>[Kh40ZߔuPu~s xbO|(g: Au~k:nB<`]ܽDj||-xl7<|Šv`P+P0v?=q'RJ!K jd^ g+ҹ$yLnњH8i5J9#R$w$HKrSscڍ\Tbn3:cr), ^3-YKm}RP&Ndq"E._$˽N>v$b;VڷqGbN9P2 `%%VJWjZޱXM,&fGESN)gQX[.,WHGݺ߽sƞM]+7mme+钩]yXGm(e%ξҸQO;Z/xW owqDɟܝҥGhܱ\|TnʛTR OZ]I^,[M.~$l4ay 
VJbUxIIao߾|ϊFEXjgܿg_q!wm/Qw[JY_@'ѽ n<rlR6_;Jf. zMԬB%iRRհ$ >bٲ☃S| Kmyq@Geg )RȜ R6'pk<#`L3M!X /As5-"Ln!V Qtȥr*qDN(es aYH[cUU]R+X,.(4yX[lING*?gȅϐZX/S"%+O-77S^8IJPC6Ь?r@d"]ɳ"%+ZcW^VܮDr:d{DlͯzEuD<-|=꽷jÏ639^'=]#6Y)_u o-^'QV[U]g3;%!k$b朸" D1mDrgA'ɍSoKyYh0GUsu.@;{lVY&v*YwXIzw]QyAϚ(f/'3y$A%GN7uP2SlХDZ "(O$)NTηk:+ڜYN[z,_ʔg X ŷ fZ@,t4f=`M'Vq$jho-4Tey3evSVIFE9guo6&a\]PYl7JLjE臛3| !cfLG ^}tq㛉KeV"hGQR$r$+C+*OUq>\V2iTPZ{e$nfKcS܅{PPD(F+3\{wi#cUKYYmrXR=j)'łfZ"|6wp +sCIEk߽t^WuIgsh~4:03?cMH~8cCNۍ 5֠jbO6ٽe%-AMݻݵOz^5yuynzHw-hUY谴 =vfy%G}5rz0G/V"NfݱGkzGmTB8AelV6_4_m+X<źypܫ_~nǐdVyjO3.u896s' t1d6S5vv)j{̩!g:MgZȑ_a; }ua3= LK9K||A>| ɿK$^+C`8,)BB 5\n}Xˈҧ(%4MVڶ=0]'&#r`8 k`( _#x15xFEWV@6g!i NZ7&<1eDic"GO~T%mzؾFB >e:.'™Vc~R$x 1tH8<8 j|%j-q3"ik$Ƃl#,lNXD򚤄c8q<&Zj2[$x_!xils /8I m0$Go F_!xeyPBR T U T!e0Kok$oI[ɗyX`&m)uHh33f${ǁJM0Nh u5Z ,bd Rʼ9 9Kh1MR+K3k D\pKh hCNDP)dʤ (YWHh^7{1/ R1?3،3癡=}A7/i%Ҳg4d:hb z|0vKp ) I@ģ,I e|57&kK ٕWLQD(ɔ sCx5Z7&(,BćT)&cOn_#x 5۫BSI PYT,0$ s ͗&lipJCS0GϾ|n^)$+,H<k21EeQ"S98_#x.:jfWƕ z4Zs A +=e2Yp(pҶdzA"e|,˅,ܤH"Qm)tY뮑Z5oF2A-."6"֥U>Ǽ5xF{ ȾR2Z͉p4'5xL3-Ly,͜& k+ u&,Hh*:6‘:&2 Ke}+$4/nZ.3+ϡ1yŜxGB^8+L(k$_N[Wޢ(uKz¸Btk$M[6v5鍞KS鷻 ~l+gLrHH*l$9{0'`G,}j$odqpNh$m34HzմgsE5*+z*9b8FB@^"%8Ziه1602\Hhq5uk}eb$U<d1|&jNY .s1V#UK&k$_c` a-OGp8􀯐|6FMhۥ(ILeS Mo|蔥%"9H EҚ櫬zfey2/&|`{i<%R&08L5h*g:׋qs.]5/l62cxLIV!q!>}FiJ WC<6[pn1hw;o~k,}ӗ;V^{Ζ[67i|7 z:&OV/tC?d/4m,^e)9'xxRo:4c|B h${gvtYqq/Lʏ.\˫•_Ͻ z~K~YjV}|EE/}܋;2<"j+cϭxrF$f"U;mv<սz|"޲S7Qrw`y_7uݻ4iwKr?9[eߌ~s~4dBSrL nZ .ѕWq1A쀹,+al侃wwUfℷ"nǮZxE9y5*دn٧xfXJO86#˽%=sg9:9z~uŽ5:3;ׅ'äkWSBiR{i+>^FѶ^]P~:{dSb{K#=@Wu;6,!qDGiKd, YU|sݫe^sQ*JytU8,vCW|@tUf0t%AQNBWE[yUR둮^!])0JCWE]BWE[OVt sT1XVɮ[N?Q*ZutZʍˑ^]FtHUQ+ 骨]m8u*Zn =+!eW0g9;-SoTԫ3@?s?xaK]ej?m =OKsbv=JtԩgJ}MSBUC+r}tUtWHW\Y0 * [=Bu- ]mm UR^!] 
-RvUVt0tթkBWE[s`Ҏt bbHUQ]u ]|1XdzWHWRʡtU)0ShǵHW0(lS u ]S%7#]BFn;]t t8`Qfh '_ -t `֗D+T ~pB-%]FMt68 oÇY1&- s og̺mu[+әq\)%5=Q"3bbp43f_f|LB#{;Y6q-,|Kݼ'X,P4}fRƻD֒RbMZ蠜jmM2`Ngfto2[e=un,xO2OX-crpz׽~% ꐈ  7˼15FũL{Ogvfi JU ynrI8qQcwWߛUCɟ ͧi1A7 ^N2jT8]Z4M~O䗿|3T8Ŀ+B⻳OKC5gou +f{fLYY}M&o'4hxn&~PtY򛷓8~hRzhN@wq2X4&x"|)tqGbcދDA{yh%.;˜HHb j"UIp!}wG[3Ғ` eխ~PhkBQ]L˿>Jk6]TJR5IB;@n2~e'-mʌƝQuB,v[qԈ8`a!4x7uB[[?}Dw8z]vdhiUB|;bSWܘU~Zz*FUSi?bB6}p>@1HNC\R]<{!0˝AnX}UmWGVL&(CvKO$5X=.6muX^f.?=:`:Q8< ogjU6luq0^%HFMwi5ު'{?^x۬!bʹj f)elH%6*,L]Ե0N.ZKWXb À l8tUb(tUUpt]koF+Dc`Bla&$;Ql+Qv3{IIz6%JKS{zBC-(IJ+7tpOzӋ :xu( tut%)ձ3J+˴/th:]!J]] ])ʹ6rB++/th෇ J%ҕAMo rodC+D]]"]j^+)Bx3wh5:]! @WCW1&|r0B›AD;|@WL˦7WIkF {#C7:LSP2znC+b<+.׾5lt(!I \eGtP•Br4<0PDW+ ]!\ ]ZAOW@WHW(  ]\J/th:]!J%ҕ m]\/thaJ +7tpq9Wn(TEҕD]!`I+QWС:ҕ\]!`+{qUw(MjDRI=+?BrVml˦WDZ+`{nWrnhٙJ10ue;Е tܦKc<+|m}•B 5. ތ/{t*j%PJ`$(`:ܻ-]ZD|hY/ʴ~C+NIkڋ iCFZxb#\㍋ h9aCw%ž@kf,` \|+D(ÌEҕB[BWVȡڅNBWR[NGt B C.bg>A J3huZ7:%ҕ6ʜ2 [¼+˼QWVҡ,ҕDK9{np7th-BWBiaera~_}T^U+zGXJǛHka)Z|Ja9{U+k5k??}}ϔ+a$Tr)_9 w!2Wѷ>W?}38Ybjvֽv.l.D_זk/U]-;lcMGH"*\8REQtw;0:%&rcl%37_"e`m%LJ޴bAiG ޭA%jP Gj Z_#j+8o[V-mM0x#.@m^б%Zl'Y:8.Yq|E`0euwl6x׍VjνfD BMtؕ+Qnލɿ/~=걷D%ozonjQ-_7EpzZinV#aܰ6{jx%yC@;.o裶Yk-0_sȶRҟ\}v6(hQf ˨ץ<>o[Ο7(/ކ«m+_=IGpk)j?(j nTOa;.Y_{~ g㚃Q.Yݾѱ)+b>.,k쵠]ǧߏEZ]9Ypژ/xRpj/nq,OI} * {nQ~yQ4B6ǃ6Y4UTm5DӞƃ?.QCā–cEpDh˹4zE0 %> .?oŽhՙ~wCiVvAg7 ǝ~xÍ6/z|ҷ$YcoM"S9QGqn*xs켯Emwm<M,^ت2[Remv}K/u֋0f=b).ˋA Zi JÂs^ B6&`lR5{T&<~Zk,@Gw (ĝNo) {5p#2*ߏf3 W+QʦR[qA}NZ͏cKߔ7u?m%uhr3\7g/iNCsy@ٹjV [,Qd8v6ZB3D,Onp5|V$n"Y;`ϧc dbrY\Ί_]v[dݮ^Ϳ޾T$8눿T܍seyyLFc`@iũZYU8LǗ1o۱sh+V'G } Ui#Wݺ@xw%2 {w{~J@0xSqϯo,i~:__3mo# Zr: hqM*R\ZMz"/M2~L3M=~Y3etmnޑT#J\}A=eMv{mu%)*IT5x]3jC lv[=fG׋O*ا`9jRj={cͨLjroy$V4F۬Կ,qaN|\ey.+HNsaxќI0ΔLOJuJpyYds`ڙӆwZT#~"l% AFujU2uEUu'clwyp8*Qל~Jd7빳;}h'd/Ød\;4:f %1z]z;b^\C\ gxZC^y?:H*hv7ה++JJqfӤw>qb zsWZɟj2B$\j/` /-KӃ3s$IX#I*\g28#幍'9\J8ImrRHB[[ Q@&3F2Rk cZi‹^gl{V]Vb>Dqg/[dMЖ۵%:6kU{pl+.yۃ̵LDfǓHHIbRTe$6h-Z9sEt9Zb:,$AmQEUq;$Y3i0SQɨ:lΟn]ci=c*I]4f㭫Y1)wx>鷑L"jomdql~i l"wڍlW(v̸}:DUC ŇjQiOksNt]攳ӿ¬`/Oe t(7 
?J+'(I_"Z)Q^`V:1*׳O>[qδ:9SXjY% aadx{MN+?ؖb$wvnFcwt7댁s$w8Vm7LKpJ4-GaA4qhDgW /n|`Lq |}j<2ĀZ~7B{%]u|t4c2!-N蘒V[Xgivƻu7Y9q< ʢsW*5kч: . *j*Ê񌷧F/.4->v!p“rDN zohng9xپJ6ɸO-e`R&Iby,2)cY\X.ip@Kݼ ͠Q'G)pkGU 8 C痽[*ug\t޳I޻vמ>ep -8v* ka3' wI%,)ti͸RZяwc;WHB2wn^)\etE N0 b\:Aӏc T3tDDu”[0a\A2m 3@e_R$K1 Tjθ`L#bLX F]y!sv:Ifs{E6f %DiI-<ׂhHifF8ܙ,9˄扦"}؎`i҂Ҽ#Ezy16%~N5%$MVEח_o|V|Z*t41H wmqIW~vyȋ;3~X 0yS|[̿U]l*Rl+IMvTVĉs2dzW>ڱ?,}y `Xh7z >ѾӽfZ۩G< ϶eJ.nwzX_c9jn$EW.ƤSBSHIB6դ7 ֲZdjq'zXEe̢{_F k!w@0_L2N?UFOMl?˅)D[B:`#^UO"z|QXRQ8zAN2Ku"9f5bbCs.h#Ue/ƀS9P(T0p\RWǧQw,.ޙ~MROwp`anx'3]I͢.O)^ahv'A)]6[~-6?K>hŠ}i{E`6Zdb | S~+,`}ljQ.m{tV!R/DNoJ:|u .>{3Y9sЯgӔg۰*nXny7A4gP|Iwh2e7(~i)2m kmWo6 /^0?#ަG n?zvKx _ חM]7j[,fKW_fk\ިwWՋg'}/X߭6K  .{94 3apmq釳Pߟ>[moWNg ۝NoY@ZdR\&;5irZeHO= { 8\ĕBx񍷇BM/qZ}-Wm2߽$z Yt/ml6q͏1#_4&_%Cn(c:f~ )֥Ρs9"4]?^/4Ydj*SLS$P#KT\'ej.d,B zɳ)تL+g7;\;]w[6XJ{?Uk) IEStn\;==[=R{=q=}͞D)m.(y:~&^^DyAĒKڷ2TqΖڨh+I-L6"1PHr3Mn1{ c;Y!y)zzѯ/B6>j: آ \hM|pգ0K߹%+$n<׫_@)q@.0K@ټ :mGS sR^J?\ zwt<_Ի*L7&]D )b?}Anh?9[%=/̍3wNmvr~ܫլ/H;ko ;f.$Q?0e_g5?lf>9;垛ƿ'obdk2ZZáU-u1Ϊu:;n.$NVZG}"?x}w=?sVgk!G'v.%gZY+jh7y%15I%(҅c`gnܮ'}(⣑$CDCSV I\khj^S sX\$k{?߳;~ 䖌+2#Hбւdd (h2JMVtjMy1n6Z m1Zzl"w>'\gr5\X+bW WOTj1&ZJ}@. 
8F166cF6c6.bldZ0:Ҕ)b˒rx_dF2S$QgGK#m=6yʒ&S$EnFr{B.7U6ւo96;\p AC'g`D~~S/YR&ŕXGA P2JbD(fW-OSu.>7gCнJ#gJʩ6Ϭs j6՜ȵ%s !u>ÒxI1} V9@5 FK!1 ru^'Ń zNEb9@tQM|[j֗Bd;dr@]0VjHKEw)vTF%~)xgP͹ L.r` zSq[1`Q n<)p-i簮nlg EeiD)Q_CWP@b1bunXRՔ.\(5e] 8wg=Ҳ9 Ki"*0F,dd@9/ ݻZ(ZpU6!J-( a[5pn\jX&EȆv2gOK;6@,50!ЮhYvBͦouWY C@~a ւH@Lhel]8χVxf#n0vT@}E0*copPRH 3@e*lkL66_@b M%dT R`hq 9rCRӢ#K# 4}ARPSRo,ՕbU.Y^ud" M,*OjzS KByt5JY2Z.kI uWh&PGwXj9AҤy"' Y\Gjy}ƬQDs|]1&@TXʝڡ&D_/06/LE=.ῗZ4JFգ=ۖ׺ko6-D* 7^xu:p<(Oҗmg%SăJ r2ÝHz=-,$ /#ȃZ$J$$CdB5U-1Tk?H%{, @|>1 5dm<om Wt_Pu~Q& :U9;V m+dəj&$Bd|&˭vz|o]K]@U%A-^`bקXkxhD0hc c.OyhC4*st]/%4J6n5b D="=ыv }JP-6':w,H ex /;,u"/90*6$MyB0H,ϱ[?,A,  3%VYRq9&P?z ZhwEDM{0aQ覒|y&B [ˉ`k^Z7`3,vNj$k4,NucH@)BMs_z+$BExfܭE٨@?ofսKPA`/ q{kCМqsy=Ehmv˴G' s]W&@c7669 4I\Gpow'fqq\1)֚BԦq4ZVK ]ȘBys0w#[/J3bK&F@9,CeQ!](rr `X )QХg':SnC,C-.3P|o&bE_1h"NS:X&j@᪛r$oY^p[0p$}˜.24FDCGu135P<ց.,~ [:Qbr!Ǩji=Wnl6rEb׉ d *jq pm%y=?jeB+̈́h fX[ш[,FCJV<Vr&@H Nk΀~2.N F; PH՞KXpw9vl*IVIti=Dr9d莢<`ւM!(4]Xv馀+9Z4gy]'b!wT@.8=cQ w'Y\9=a"E-4u0)(Jƺ  ~~C+Oq4dؕ>W$'nYmw~^_!َxHF1 \2cAMУ<:_Q'F0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 k58r>$v^Qƨh ݨ(]Q 48QguQguQguQguQguQguQguQguQguQguQguQguQguQguQguQguQguQguQguQguQguQguQgu>Q8!u pNF:q^QQa":hTF?{WH O`31̾z0S"5$>,"*K[b%""/:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD%k%Q QHud9IDHQ>uD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:gCyp׃oi %gc x h}s0?"תyEBWxE;rJhUX޸r]iϿ^YR)tW]7gwWE`M{㮊]swWEJn]Cwe U]Z޸"iOi?])Cw.c]MS0L64LnnNg_n@='3T]]dWZ:ʎ}$F U%8q+G־ |j#YUЯ>S _M] յJ^OAǟ~w\Hb04tDLrw>ydg-,^hb5Y'͇gH=r ~Łl8"kdS(fGξ,mo}W8"AJ8{A}rW K ]q싻i!9wwUdMߣDZڧPZE$*wR!])5GwݕX#wS.z㮊]iHi(]&bZ)pvMgE-b-Rwnz^.gPd2IEC.U 47x>wQr.4"J/?_[ZdDxz)Gx.|0/W W~tsWTv%0uްltzN6(Y`JK%Kl 7FfEQm/3Q;Y%emWT,d.>UAH[]Ori@ w`{ZN>*/2+?N@D=~{:_Es4h-<.ao30x;Z%1OߟR6Qj?|ίگ/?Z뷶}= x#ztj/kmU x"I7)5)ˌN:9eQgiFtTK*Bs\?O-uD~ ޺,}Z.R;wrsh^J' {rk ӠU3@X?.LAŻxyT!-^!,;iոͻ.TC-?-W}orl{?|Tjn2NUn僟̬;1MIU~㞺NyO?ޮ>KݼSȷAp/?("f^m]H-t𧟷Uh*KvvQiq 47pR7 ~#tq+$VjʧWU_]uKf*ȼ HWx_@3btEM#+8הu@㲅5w?H(لC8Ev` .Wn05Je$8?:.mL#i|hk֪"ݟ^8s> ͯo.o ֈn]ޏ$|J 8,ٵTv 
ri6VJ.q!^{֚*hv[Wc꺛}|L}t}E-;rOk?Bm@PAi#(%G,N 9Z~=2yTW"r_ iWԲ$uSDTCq\ߔu{DT5 8\SI>(0F+kzJz/_katJ*>r7y .W>7;ڻo0WOËhvTڪ}%vfp/qӛ6 _vvsvE}g/+܎k j3 W5ZviAchVy]Jc]]-200_0w"kxeBxa_BxSJ;{u1픩PRCX@hXBv4vM/=.T;4tORnibsE a*{hz:68]Xjbfuez{n7xt 6&"DSrVsh^^xpM8x;Z8`m'>r>aq?u<矽[N?uxv yx/q2s 0UBV2V*\pR k'¾gM\~|'hDO!ԜDYڹJ]IES-Ԩ֠l(=c:&Zq=O4&%T!dې2 3Q n! cs=i୍c!TĜ7}v~F|Ԏ|;rfLz4KV;4~.̤[h @ 1Z \a@#oQeW_e],svfRq^2A9\.T>mEZ谻BXhجvo =|('1\ԓӴoL[xd@IL._,\\LnTS,g&33l4U6X eeBU*I`b a>%f1hj~Y`CFxCuYr̂3=|>[^v> 79gx1_{L^!\JJff:r^Pcp[*D,ΔH]r(YiTg~A/?T"=Ae*Ok!l8,i)& IxPi͸PPeZi 9+GӯuYf;XúKJ4ḵ˃T꺧O3[b $ԥtGIUFT)3N[1[ϢK,NQQR ߉YyY8eFXZlhvu;g̓hPhy4|z1gYqVW:(ᕈ٨3Sblf&aU)M>`'D?,vI bLx`L1Xb&(kcI;ׅ=XAC{C6f %Di5IJS`>TRTdB8˄NSJ.le}- - -픖k ;QHLHWQU/VW:9b?b+dܼd\+o:WYDJ+[(/Z{D[{7n!~Â%7wؔu{9/8I,+NkئńXՊ |D^֛:`,X*7Y,ZG]™depfrrTkdP܁@:YZltfhQYt2Mkӟ~9oO=1Av &XfQb7M<9 #"~*#}BH%'-Incv["UL kˁ GI HF09R6NJuPFk!h Q9^ypo i@6) :+ՙ8GzNd.`~fAL7 C3ϹE;v:*9f+ʤhMI&,xe%s|gblYK)U$d =FM%-3]U{vvACNSUuvSgnVْk/9i$\&4̢a;3G+33SPPD(dMX̭%\,MJ(:H;9zrºXE-4_1/pZ;%9$fi6 xBQx2uޫM'/8Cފ/~_ydƅohYR?\fGR*ZA0aqxE)xPLB=I5GM Hs5fVwo⍔$+CT7RCB_:ԘOC^ه,'2k-\&O(x=l  Ə0JǕf-Xl׎RL9W2V /xew<] o۶+BqŽ]wV)E@RTő=I d+Nb'Vfnu&~=z^|_{RSYuat=ngp(U{-,0hZ-3VΙ˽$Bh>nW9WJ+ԕuR˰s$a4E 'ap2oX)d*`LVg6(ۉ0݇7Mo?|͇L}}WԻZm%1_77peVP޼h:Emzk+6;1׌T<[ D䗋A{o$~jU<+u%+y}ziP&nEۨAγBh UkHhfOh(UPB+xjX\ ebLq.px&a!\f%r"K\Eg(=Üй}Ӟ{ e2dkqcHh")܊e$Gr$lf{;{skc;{$Uh-b$̕o1ʊj W[X ;R*pʐ!O#cwA.4p&Qzcgn!p(36>C9sA䂁]Bkˉ)#6Ni.D ZaW|"%o(4̎z]YiRrؓ 0N $,a.aN(ESfM$E9|ZeӚz=N=t;;"\nnZ \_%iZg%-Iggnp\:Tn|4rM %f?S;ckNq2bnxʔfΑL1 Z[VZaE9Ugx/xNRVM2Lm挵31Qicghj4"Iޚ˙GnepM'8UjƟbbCU~W5Ym52l|n{UNy>3#gOU+nyuA 3_/OfyL=]jSH]?NgYdڤy8 Ye)SHF׼wrNphk伐0o~u^g%~2yiL[gc34՟']w/ =SLv븯./,5hvRC\#.*[5Gtj\N4)|*ΤS]fԪPkZ\jBcD;S3տS:h@̱>P3~ I@7b o=P~'w7:g˦ zjjm^}{:ܐiTzvLgj.75"E/u3P.ו(9sjP%vLWrDwKWjjR(_tE[] P]`P4tpPR;]فJr0/a4EfG_ܙ"UYj1s8*#^``\ķRA~~4q/6 G*VFC?Hx U2;uz*b;]VݨLꝏ/l1K0oQ7w%jUI(nF" L:;!)16{>ڃrQڐ{ S9J)6eFT,OÛg0&7YY׳Gpe@81`yp5=5nNW;쎮T 3X*}j9]C+Έ (" E<*U(}rAvtu8t%YH]%>D+ˑxjU@lIW;KULxA*ܻh|7MT?_Vg^%Dw4 qt7Ǭ=&My/FwF OrB@IxyVqNDL02 
pi4Vl@f@)hGWHW+)dDt3y]BWwRw!bkv=!Aٽ1޵1 F;0-]C)L]t:zlc!1FfGCW,X A;Bwtut޸k]`)u4tUΦŊ;]JUGWHWTreDt pu4thFNWR,쎮TŤ`WFCWSokٜw+4">u-xZ;]JF::DJ1+ ]\P,ttP2jcUKQWRˎf n#F@);u _5AwԵ.`CI7jwKWmn#C+l߶t;zlcĀ"++ ]\m tP*!Ut Ѫb1J,ٛH#|Mtqݤ[3LѻZڍ='Ӫg"U'.̞Y)3_+ pծ5CK~E0݊!DDtUEUKu,t*t(] ]1,@;4.Zt(7wtu8tű:&uPUI,th⸵CCá+A(!LnӧO0֧0ՕAV>qXaRJNHjrS#X]guNfϡ5xL:Ta7e$8Ų.٪$Un^po2}v1 z?Y^\ ?o$;4[6||\^>|˻ Hz$7-|LEQl6'IޘDj6_2d放G= MfZ5 U=1<0W '@2O3y4rP S!\oǔsL(Qm|rέ̥9"g`;X+\NH!IS 8͍g| uw{8ZYg!-4op.8|~10p F'ˤo.kv |QILЦdT8CpoɅ&45,,7R4Ū붹 //~}.7~[%}rz]ލvnA(-̮%#uGw5ӬIs[RFjƃRAtܿы*uGB;]Luvs22o7i&=,TMB)Nuc,og]%js5*⩿q馤>+k mGaXNFZud&vehtTVV"E?5BReqZ2 \a"{ǠB@]L_bT2x%S7Nn&Me?9]Qxrfplh0qZf{&)fN( ZcI@+LwZs|lq-bzYth*'ELu T0+^Mo1{Ud8rK҅S&YXY/In~;0r 3E1Qv}7gKA-`{$=)Bzpeۺi&p~j0$QNܟݚP]nKJ-[$>zTnoZ~DOh(,Ӊͽyt]Yo#r+Fr}9@ {p"H:ϯOXՒllɦ1MSlv}WZzHF-b!,sGLJ!m2'2/`,IF?x8 n=_w EpvM&='/ g讋]L8?fco4]ݽn?h`> a[l^w=>i:tqAt㬌 Pw7aߘ fԺͼi:gփ;?mԲBZv>'GrYct&ClƸnbl53r]Oj!6 ci1.0sbb>xJ(2Ѳ*]T*fZ!a2ْd.A N"w\>v^Au9egf6,kۡ? 6ԝs)-[!GopVeibJbfRJY99+&IatV9(b/>p+Xq*r@ND5:r:_b*L/D_ܷnSijV5VN..n./5ȡiP*G o*7*2߀doqVxu H4j0JSдl;fLJQO*!a:gq.97Zz̩wSωZֹhbY\ֽ'V!}ׅ֯v o6B!hJH\T8-mp_n&&]/X;VuL`sZ'jO^6׳ݿ˃9sF9$ܡ/)xH*( 2g k00&.#v%nm6oa͙bs2=톿o_j7nIsTݤpyRN:RXmF`muN!H^C:A:t7bf!!Nj^ N_m_f0-kA g/-ɉzLf\M?[;=&zl[ZCR|wlk_ir~6,qs>jeMw8RySal]]8=rKs{c-"s2+0+0?ԩ{]Adx&˛SH{픭Р^M9 dOmk\."1I]/toL޲tFiGxLgqs}Ro?RƓv'"2/P^ΨM:t.&Ue{#=-xK62>Ie_e__(N5'oOl_w.&lxU=.5ēru!p! 
M]L2\}1Eh<"e"_H4x!Ͻ#o4 "ݶg#)f.o qLQLM(8>#DNCW)c$m;%fg2w6xI/+}VTڷ9'}0C#B7.YY#e!rS\!BNQ(e+T7i[{+W<+O_ N4kzV" #'`^uyQ7&mgpo= ΅'HZ7.doPJj:פI@qRBv՚$_h_͗Fi>]Il] 7=gE~|C,sGLJ!m2'2/dcA޲&]MkʆyܐfQg'{w <i+&Pq:>s1+GZȇxtׅŞ>cIf/ ;<JL$uI/$8S4Υ, c,VlD \*XL8*d|/,PE p'#n~uZlv8ݓ?oЄ`spbt NfE#,L5$$ hf",GMY5`K@e?²66Dp+bĴrE9J,EC{-.HY'D =TU}"y3VAE$X.U" A4Jt1`|및VUUi@byv:U]u-Gtw ߺ ͦqg&[<ꆨnL%ZR1ˬ;]kS]O)s bYz a115~m |D^2=osn}ny(?<˗C 6A0ri,ә4t̕!h9zM,hJqm`J>Ir"3ǜ\,~):ZefM,3 NXJxX&,,ސ,|>WSMf4),]_RbN3% }egPvfa<$ȝ*yZϤQd]uG=⬷W fVZ{T},PZk2Yy>Yw 7+i3ΣJ >!YPN+L(QIMB&yJ^$Y,IʛNGdFf6jFlP'{Er0RF#0!b 1p/SrFb C?@P^*2ЗL\~>̀٦cN:%RfJi朓DH$!AE1sd$rdMh"Bc`Sjn{|$zׂO Ó!rɤՀ6R!m.jlkpFb@+2֕4Cћgth5Aw5i&2TiDf[:$T׵ݾ N ? WJdm,,ilU/@:y f ȄJ^#S.S`^:u˔ 0i%Icp9޽\8GPm0vj %OW|cnv^m& ̴ 1S "43a;:U}-TUϗ $靔k/I.r41%e1S)>'kĀ0:"-p0E)rP2warնbiGBg1S5kbzÖO涘M]zR^Tm! ꏻGə?R2.cO9[R$y!J6 4"̢d@Ne=#*I{C/ *Zъd1>0j)skUuh7vĬ-\4qJQApMB@YR};5#Vr~NљtMK=m/!DZa+Ar:+[1IQI<@ʦC0Ї:]h~Zn- b0(Aa> qTv4"0H]`x{)["!γe6 `=F1Ő3ZWs2̵JĂYdh5G(I1bD1;r!?}*5$GL~ r"\oaevn6` Vޟ|rqc6ӭ#qzah0X>#8U2 f0bঙtI:*Gm^mwN|lx$,`m4 ȯo{Kr5JX[wַ{y//?ÏO?!Ty?|=h LnṾIQpo迿y%-54 4fhUgŸ|q[T}?׭1W߾|7M޹OVkUϠ+a ߟ-S%#sED h;W~n7H{4)Prh.zT>Gy F$!FD9#>8aQ qhdfmxڼ|PA0""aI4T1$W4αSısPi󊭉6oѧYz|hs,gw13ةEuleh:H(H %9P&z0YәMMa( ! 'o{lO ]MFI0&&/JEly/&BwO<g 2!eLXό'!JʘĉL<"bO1Zks R%f7DXő\B tS#ό:;YywaM-WcK{˴#xۿu:Av烡8//HDf[쐳"৓;n:y4*&*rl̍к$9!y[;hpm=JЪg 3\׍ypPt-N:tT_~5JvrvD?֗U Ww~:ipu64ۻn3ӫpRSOKwU/=]>Mxw3I=9oz?F}Ag7{ ׆V G&aQ$tfx~S'Nf֮;s 2 /O Ko%%Wg1H. b@HTc49'o3Hh!Ka'p0D1r<6+p(=5Ck Ssi`;^o.R1Rț5kl=~ޟgn? tpr5QSeP]~#$U(sD()h-# Ɲ|7COgJ݆~/$qm,+ U,~b~h77t&Rʆk m~qeZ^O?=AΥˮ[3cuLL+j-UL-Kz[UBdUҰ6؄lOd!m[Be5ܘyuRrwp-w Et3:3Moi>Ҥ6 rܿj9kgfMem׺E3hcYʤ^&hf٦PiAYg"J}튾^QE<cB'F__>-"l_'?xF;JRg#䫏K/~8 d& NQGL{8`,*/JIt'+@ s(z;~7#@bZ'e)3UEM:zes( ".jM"\k)T6"VNyD䤎E;T $V* .,Ffte ΥNxiaK?2KоGSK£ط kn?f5]7>\QV uNz'M2XYM+ld&vNbt2`ik"qn1]s 9㳷ħ`)rT(Cb$&`1+L0l}4Ȭ- (YlTL`!GyN0MJ̤4IǨ.iblg_yHlcd}f OK۰[,eH0Ain8|K. 
|Yfc*9dDv,P_ -3ubYL{$CdV,_hY`)b%V1姃N9>hJ<`gV{o%#ĀAK*'{C%,q("8|^.UGdMIqW%8]-Y%7#mԴX3`vɠkRy%`bD Kjp2tVnGhGO",k-]fg{0`I'\T^l55p'E g"Z`8Ec%;]7A&Yz;hMPd~߇]p.m=R|*9НY}DI\KlZrevx^r6BE .(EQr,gi $8D{M"qN" %tE=+"4\P4 @'*OMNKg8 nݔׇaȽ%zfr+Ce%po$W <E40QcJ9-nD2Ivఔ$ hcfc;v XkGԚ8Ň3<-PwgDZۊ=%keHQ/я!OPYB\O8~;mGN>_}soEdjП|,o>\p~?Ux6yGTuE (ˣ\ @JK!gnOe5w3.TB։",m%B#YӻFM{Z_ԓ<1w[Mh0}RJOgw>U5̚%x8e) {^-b@fq@.m7=XxN*kL俻zy?߻YvܫGBMSM'dֈs,"'"*FhW"P':_2G_Qor #q2 񘻠 6KsQq Θ ,xGT:ejEgWiʚgYիET[s7)ߧ'O0\+Ct#ԕ`I샗9 #?$+r[0ߎS&aX fi1l r:WmA,SǏ˽W;5V\Y_.C]~G9`o9rkP `hACmV!(_j j%lUz:Փ 'M^}(J0@PWuҨl\xqxMuҸ{fxaiM|#kAOU/K_wJ笕Vs!dQD5Ncp0u# `T",bY@2ǍԸv:;ly폯|R&{ȫ#^Q$h'>rQ&*D:RXUuK!N7~t⴮hZ[CKƴUOUnI[<߿,QrɥV)‚p!1B^ q4tJܪ$"ME衎Pe^Kcz7Z;7&7?󜭞A2d@%f@1s2K$SMDnd1at d{vrӲ/ÒzဈAٳB^Hk u!=訬-avs:xc8(8#&IEXPH.8yQŢ,YE<ֲ?qZiӎօ+U]=B5{•ܹW5\=YƋs. eR"/@8 ScUn_!byw%#‘_C|{{ʔ΃}t0Y_ۻN11Y]E=<ioər'BޟNa`hfq+q~8';-Z">e0r1Ş ^ӗޅdKԳo&{b-X-HE5]7_M0dd7W.i8+ǧWT_{k47 :\w/gޭ z)̹?̃YJ{շ+"RᏗ:_ɻ37\v cZfkK6t [YUd1 G5FQ^,3h8Uvk%]5eҐ6RV>Cy+J5?l) j҉.BOw|m~w_ sޝkuD=0Md*[@#ݛ֢E.Mco=Eh򕷴{|hBڡZ~a? PȗEeޙʚլI^q؋dIowץ2ZT<bC<l /bw.!*#-p%x# wf BYc(:#oRr\yLDuYiGd^t(=vayUm>SaQ&k?@"&("(L[R$uQ(DF N# ;GU1x&BP^zYiWu[6n1R̜GΩ=S0PM&*؈,@XYTPc:Twyn`q~}@]@{/d̞yв[Ҏlf<AC%^tݢ|;L@Z1X%)MU+V΃|`!r9Iڃў(m%FM;K{ }ô!QebKi[_hMV!Dž?%h1'h2f+P fV9 >E2mp&6Ǜ|t5fSFh;c Ui0ۢWW%M0ˎnXz4{zF c7@~7xxx{i~U8:-s8`W6{M>/.跹iޟV`e ӓ7ow良ڈ'W޵h,Ji>ShfDXN zRvwl&aQ$o 5G䊕Q P?SFTT_%U+V^yKs*YfdPߒ**^2HOga2(FX:}"Z^?߾e)%:eЧ~$K`\r#>T 1]> KZ}hi溗hWtJvc" <]ɚ8傷㛯"B: Jpw 'WtdP{gHFnjk&K۶x|Kz)+0]U3dUٮ;q.q96-IMIV8B:2<FD)T/ re9z Ev]ݞ=$-~ y.TG;}i*"7n>ռ:ռCCgl^y[QrDZZs3FAh7 fE"nO-"U").]2YJZθn \qi \i;\)XGצ `*/&)W8`W/5ĮH`X)zpU*ZqpER*ޱWȸm+dĮ5H{`*RL\2LI[W$0<7WE\iWECcWEJӱWYiEpEiOUxzuWfǡ7' pu/ 7O W=mpu?)9pe:zsn\kSpUE"iWEJ:zp%V+j \BC[HḵI)W^\Iɏaս'&z?qj \i;\)HʶH`d5pU孉]i;\)ezp$ZiZWE`R0WECرW869E`gkZ }H)EW/Јg PeFޟۿ\9rw;dYIƋ2YڛW:J<*12 ѣ /3-[bS|㦎8!9%3R5˖K]Q$ƃ ,s|N?N\%tJRgJK\*1 9;H2A2ym6HO;5'$+4%\2:.@L%0'* FL̑6bY%eRKR]Hԏt/r=i\:Nm86\TZ|ʓj*Z_l2U6XAP2!*[feVj0$jt'䐌]2dnO3{9k# J (Ĉi2@9ȄdL1E&hkcg,N\v#XgcƬi"I6k2"\ G $4GLDgt! 
,ii8g#4M5DU"zKpueY~j%;l]E%-K1٢cE{RDqh1"c?fOR.+ȋ&ɍ\X]O'gd&2q.3}|bqv܇3.T4),.沎>XI~>t0y}1 q뗽LbCBlv|{t'bI7͂ҨUƗsm.Ip&Yq6}6AGB-U5lsoYǣ4oY9Z:yy&9fbWtq͎ 9~492CjץC!Tz|EFrݥ;#ހ?26s09%R6C Ǡ "O9AgM2Yj[VҨH@ w0kL a[ɞN^$ WO5w]{WKqؖZeqy&Jjm8Sљ]V+}_O[Iq]|/Yf"x]3ʀ>a#ϨSL 3#rb,1˭rwvr:YfiVƜ2I% FtH(QF̱,e9q<2AXmI9DVܳI >h1qvڮ8&3zmJÖv1 8!9Y<+рKМFVZ%fNؤJO4OE@N?,,Rv$ъ%)~d(5*(sbVcT+LIK}cNK޵q$2?{lK~09'{vq )1HˇW=HC"1b94{U]&8)XzK@FC4"-Lp uG+8Nnw6}W};O,$ëR˭޳4'5 ތ9MSV@ p.֚ WrY *FHrT;,"@(?h9K _k,\c- _}xa@3.*B?9/@S>y-R^ђE-(:hOX7GmAT6Auc]6z@" /j_C Xr.bvA)c $5FHβD{M;w۫.,Z^HYIX$S!"BssyD(ohrZ:q\e.zLfQZ ]l6$V!FFoQI5&N$NRumL2LN[(oQzA?ag|t:LI5*{&ιh2Q%w%) dkMr+o|Ք Ƨ[tU T-YWj)qLßHpvg5!\TեcM>TmPdL#Eq>a;R |_Z)(JŒ7úܛ1v8e/O&kr/ko09ƥ%*E1P)YH٥DxBaaœbyV6xNr|[vSZ`X`7šݏZ7b='4Y߰ 9F-S-ubZ{GKۘ)]xb҇UL8pM 1A)F `F#c&ˉ3h0P|5-Wwebc1"OZO_{HuXķ?947sI-)oԐr鏫G. a5gd.UxזBp Ȑ6"8C9h` {t]S L ;guH!z " w0['ܧhLJqN"`6(SZ;dJF|nt Z#,gNϝV;D!RaeH vo[ly56o.4$ ^ˢ߯;S) Nދpvչ L9)C @|wc@Khf>+ oTH,x*URp˓Rъ-BLd u6JQX20UiiLF~a݆c6%` R$Oc$hN-ɻd;? |:1KpH @NhM8㌗*oJ% 9٦L&]AzN,,GI@EȗLy`XFf]13)n(s\aCN ke4[[3 |"1)R^?LM-Tяe#X\I1,SEIK Pͧ( 3.5ɥ~Rd_( yUe%eψeJ6ÄazvϮd0fxg l7goPWK-0qzAUH'J*L  =YZ/W`'D~?6Pb,JPr#V Vm[_]\Onn?Bp&?sM3U+VǩwpV\6GP> sZŽï& b('rF̍~z<휡Xmݤtzwm[qHL7nH8<|A q0,(X$B&&'MdӨsg >2>|n W*b8/cN9[өigq{.x]_~ÇKB|wwpV`}Vk$L¯; v!퇖azh6C6g]-*}">p?}n}{k;. 
Zq/]B 7Bֽq.JL̨7B4`&ru1~K o9E%'SEC?A1ZH$Ĩ2W)7 ;CthCa:G'Ɇ yK(L<$E&ATw;\UxI%띓4YE\չv1҂ᙋQ2rz+(7F?@Vlk;_xS|fA}3|ގ?ܥ=x3=F\}TO^F\_~~7j͞WvRqb]R&1hz(@cJ-<șW_+>&-7*x{tU.lYZ۬٣2Of8Gߖ'k1zK7_T-w&wx7^Ż,Nw;"jpA/ǟxeiFje;guǛ{PY$[A5 H%,F"5Cbu?p.dtژ)с.O(A3ccIڿ1\x=qh})KT.feFaSm1ʷb-guw G_\]rϮ )W?JNhyÁ#"@\L k:I6uc9ƴkTKIg3` ]W]o'ea]m["ג6OAv sק3O4Fкnw9otfs!YvXun-Xfka4~޿q<}۹SzcFuwNJ].*qj:NsMu9$ۙǖ^WɎ`:wޏc[WWy\r(EɇKՕ}1&סqt&on;ba۝.ϭ_O;LCvA9~9fn/䦓c|' 9oo~ 8_oW [[)Zbs|b3E9\:\c f'/`>Yw W/,XPX(֖|2Q۞^m;(DWmL`b{"[:mLWv$).w|eNc"ˣ2Mץb Vtg gu|_o.n2=ϴfQgfp$UE UR-:6a;z+)Tط[&ji*֙6I^QpqKZu7ޖ '-F_\|oeOXol?zA; >'B>{ݺK\,,?r-gYo<{8naLRWM>@X##ꂒ\pun{+d7a?j#4-p'υAWᓼjy]ȵE3(QAԩ;]j[5Gq[KF: XNm\ g+֐Ȝ Z&$ ûe:sNFM,:h'mF>ϹN,P= nm7IB04Jbx~F K+!dZZ~NO F8D1]koG+Iݑ0^{` .pSBR߷zGMGbi^:]}r@K!3`ŽW0CNXm?qd8=w6Zw7>hXǢ$2kA/ɰ:3w.~8V.\+pF~`F >0{Ykc)x:Ҕh2ЙaLY RӇՃvQ`Z2+붱#!- eN)S}lJݪRhrIRhZ ƏRH@){J=^C}ZS_ V>338(*0֗q^*]\FÕսSx61G Fn}㬅I%LR\WŒ$zxIEڂt#y0X>]멾v&N;30saD=>y AHRVa S띱Vc&ye4zl5ͭHgstrnϟ4|1vWaIPāS4ZjtHZ#B I˧XuG (E.(ql#S{+%R.s]t8f|UvBGxЦ~s(仳I8*TܚLRQfV?d %Zy|&Gm7 ,3A\\_];m2.BQ߬bMq^O# \H(cWGe9AQN% =rc4&ZDuױgcZadPg($^2"Y~4 ",iP%LR#1s)i]kY5״^iOZ͆o ytΫ)/uzz 갃GZ=RDD#YM8JQ 1 "J%dRRZI4U$} p ,g4gпax{K TYFKm)9}ݧu^`jc)Q?*8_z@Pp\ xA7%>1pHJ"F`i71rES&y$tVi,`Ve)A@`-x8z<˝ Wu::?;' k|v2,+_4`|wʕ#19fqqo1<'<-0J oJϤ$—d.nx9Mk6{ uD@4ByȊ Haf: >5"Ay٪GQ](ܳ0 Њa2{e]Q.31JD̙ @"%)3]*DgQz=r?iM:] ! 
9E1a06j'dN{N S`Oq+R^E54y{pTP 9K8E"Pc zQ2MeiUPzo@ 6 6ߣ5+.>X ,r0R =xsi1Y1\8&f8_T)=aza\eZ^^rYܩ֞"蟯ӭaJ6U"Xs&4;HS\3)(ԝ4tIr~o(08U `I6WW"rS] 2p洕,Uc靿0(|VN?Tɼg_LbĚQ 6N\o7Лi%؊){JUbMVuw~n}_x 6T0,ʜ/Q>}KM.-~UdW?Qӹ0!VU%1bH{1,,oA}9 z0bzӛ2]T.uȶR[*56NFGXH*u*@&ۅ\|q׿?Iÿ~޼/?Xof%J*轫^TVE h(oQ4UK2gޡ\r6[6fFځklk@/~jL]:rQM ^ 4]|@Ak¤XTz;C 7ݲЛ,}U4[4;7%+as> RYb!Lq؇<"ȅpJ4E8D(=b++Vym$-XȸH`$4nEd>Jq$ $[oɲsTIJ%v'XlM>U\0=R w@4.G.m57_R>y7D,XF+5# Е+OWޖF\Ԇ2mˠF|n4 /ZDntP)`F$ S_ *Zo¾ B(kB)0%8ŨXI0ΏW5Rл`q8~6l35u詥#,e(Ue (e9^rË]hvD&H od"}Me>][|R^e00 ʫ#@T#MFM\HrLx⣇g~p/ju~lS#m$Ԥjl]^ͤh!FKkWZIIwucqmy]>ߢ&0(rn4Be%j%VRAhWN~ fhv-9;}!]N'U¤W4~S)OWaR}r%)@h\L?:r4FTS?!D흞X6Ev;(H 8cݙ-k% S)!Xլ.j¶0MSOY.y;fZPh<ﮇB }=6FF`rZb HƼWzZ6*cCqhv!BNL=x: +pճe smVb.K| x`LY9&uHe{y%ɥ\K㵲2aDD&pI s+&9NxQ>oI8h]|3w.̣0ToӍW˫ M0lKDrh --% }K^Ƀ`OΥYPia.bqhXX X;U R6HKp3"RL k%0l|8[u'c"#j3"n QY&;ʆd>hD>%F:)vrmrqLC7:j\Ea fBYwޞIyk*t/?* 29W[jVP슃#7#̲7 #B$ЌB)|B0E >IK.׼ֹ-zO lVƋF\s6mc]GٵY9L ѵ!*̊ԣb%b+ Wl竓3gDtp9݊` y8򛫰zlCBL@؊JA[R.k%OƮóٜ -Ho$PlQܧ5tqD񡖄l/ö."ޓEjE.{Fz9Nј_trz sZdD)1lf% ,ӽʹV)b$Mܙ֗)hW`y=]4rrf׬f9苣 ^>]r`:jW4r*7Qv]{;2[]$ ]C9 ȩ'p"i\ x\;|+ m5 )2``R߭P!Yu0EFXVho4bN[e6!J2To4&L_7NL2J{,Gm|RҎ6++dn٪6˩(sX޲^VYd{i"?rW&#XWv. vGw0Mȳ s34ȒpR`F;58A]ԊH ;\H8 NH"*s4xE #QHY5Faj3jX$"﵌FMFSq3rZ2P]iK[n}y 5nZ1yl2lU6_W "zJ],{L%~. ,Wȧf^2Zu8ܓC}FD$LZC#Az"PURRD%r9KhE\:=Vʁ,z̝vXh${gqoR0#4'(xAclj@͍g1bz%̧|RMl ֗͞2T\1CO\T`'JLI_e'+_~(O) ˌSgpK}Pg}ñ\Nhli;b?1xo@_eyK aRu $c4v_w8^q!ȄlǍvKjnJO)G]A'8DRwJ>QEw D#W@0#se\\%jw-\W@%Cz >"q9pCW\E\%j%9tqT.و+. !U" K1;qeUR^\=Cq%$G#hUVCWJ{qŕf?07CWliS`_~+`&pLsgEE-pe`٫)R$CN% +k"0Q R!M:DAU\f  v~LI(%cQgU<&*?' 
c^'5K"8yc |n;IKmJ,% l1`=WؠQ0R% \ d#/T6ePy۴o!IЂ[Tu;EYyn[{ _ʘe4֚x >j7 >O")z>˻c_ `GX磹f!PQ`z1+ T ֘H *)bԜ`M`rPd&L\'ZF8^(Ae zqyq-iYf$L>*amNH ]G+|k\hsu^lò,hn5Ygp9iA h0`5]E縷 OA)8&=s@s{>&-i3Y5+ 6/)P A7 3âRVc  0B'Lx= � |fLj,%PC-9$RBa Ql4wSMmIDC`P8EN``0Xm2h='9TFKq3Rbn {Q 9K8E"Pc z9dP$X(05e) UHY+؈d>Δ5{3B$y\JO;\C W$௱~cz)1C>5.ewO {y?#Y&"~dL<)V` tNbNMK;=ͯA^ΪgJ``\tbS) -,IV+#„ ݇FCEa\1gf0v05 f fU[^!^n>ʰfRw[eS$wC YqbU)T=(_O ]km໛ b+ b.͗8R}7s"jn f'$GDw}/'tcdSc5zӵPHʥor_qœS՛\Ǔ%Ѿʉ׀vW+PU[jyݗʉ?CwnVѸ--\R;ݫ_ש=r^+k)L@yM>jb w MX5u[ $G|>JAER]eV) 'FJC¢4tOہA"mHd#"_As4xmg툨GX8t;@9](5e& "OZ ,"رػQq!FZlW@&^M>l>\"0w9%w/_>F_yk1:)dh5GV{ "E^I03gN޾Ma7p̿tx9ZYb mo˾\SC X*ȥX&jWNT =|ufh~;r4kGfӑL2PԄ`e4G.酦FL(35Ϭu.AKE"%3;j\y)7Xuc>{ 2?20U!8\7KKӹp=*vj`=[cg)>иכBBmY?ϊIFREUGVةR*2$3)2'X%I싖V5XiNrsn,% L.G@c ; D`Mp(NósL(б6ȹQjE8#aRE]V#~~O"gCh{~˹.$pRrY'Kh%\fُifOvO⨭j yqcC /.bN`?ebbJ%?Xꚴ}9 3z0֗-X zäsA0dO4#9O|Y Z9 uq4x upv.t'"a洢/Οq-x5tX C'bmX׼0ʿ[IF'`By`p|+F-t$'p&i&t&i&q&kUSh͵8SxBČ"K&z" i"DycTs!"wD BED( ^(M3fEiyoC0jpoqy3_?#:܄;>?{gE+5O)]]=ȝCynCBp?hbaN^m͖wwOy=͞W{6RKF-jvW #zj{=o}u7=YdXcj_~tO+gI`\aC=msLc=A$ӥ.*9\wKhp}at ~-QA0uTWӶM0Bww"giPoeHدPf-XQ;1pDC (5c?f?D+T$g2&RIdZŕ ;c| '\$ѡGᲯcJ1A0rg9Uc g_r_MY˻AXn%L<]D m?<‰Sq1:ieτIghù0neLIR$Ĝ7pTLb boZ)%- )d` VsJוquWQrųy0=71XEX4=n..zFohhL߮1yc>$w&dEh.I w0-vb4mΚ+߿qMQ`.7]n&)7S s|׭OaN8,^ 9TѢGTcW_pn].w*dvyRRm˺K\}?`u8P< ux*%=V{P;UԛTjk|N?FڶWezetaw[.`VH'cJ}lQZ;Jqc /I9ŜrF2;Φdtm&;\}`OA-J[IE.IƂv+x' (Aئrx U3K㹠n, =ink=W&FYtr44Yጔ\;%1{'qBG1"!XdKA('=K0-hrt>k"R>_m#WTGAiZ.HORe6~VedQ%U2ZAx?vR6{zԮC`J')Cb$#“Ni KC`7&ǐS-ƠO5>飬'5,ôK4@#Ax;f@IVhSH"U0adlsEdNKN:TxI*&zSsnţx0Y+YAUkǜ%Q*pl|GN<ʻ%QhkATY(VhVu=P/\q3 ={ f `^':sWG ?$I#8h-;+ 1"{˕Q$$]എ,[?|E5FF׉.˅绱CX٥l0c\3.04V i=Z͒9 s CP 9`rEEa 鷈']R)B,$Qؔe@0o¼*MKg3LVY+Mi ;c.WZ{?{aEǕ'•x+Ϩ[- SWM1:Z^^ cG/>5iK /=lyTdZ_8SVf ސ1 $(l7MOSo-n;wy{%=ڸ}/Z{/=h\8RO×ʶ C!-EʝXHK-f N%{ P/Nj 5sDVږzԿZߐqkm!) 
녪LiEz>utkޞzNNm!gtܡ> BED( MoVts5o{0o6IW"5p܃p A4T&zE1-4Ж5#"uq~עroIɂaP'8pF[]-[UU~O*:yWtetݦn~\ m%+XvW5y8ỹ{L,zZ0 K{s)tWnGo-)Dt٥]!Gx)J`s2¨ Ph-4gQ]PhSYg="Y^\DʜUL듿llvj~*@#`?$T h $+}=wH)1k<3>е' @QeK”2Ff!]KBHoLE' gB {q e'1ԬA _*Y & lUHpVU ##.i]ҺRZ {Jv.ː.Q?t٦bc1ރ-v[(~` \Q<{?5660,fd[h*khpG]|)=u7cy~'OF!%Tߺ"&Ëʤ QYJc\Ā%HŃ! ?\Is.U؛@ymf\5ň5Kk<)"i.C= k%/AmQ.0x BV\:|,(]Aؚ $T%\͆__Y@nnG&K4 Aî,Z7,V]u͌- >YJg xbE@{cd5^ 7%w۲ۖӰ-?nmdls^CW2ECO=1?A4=n\-l50UPձO꫟9Ås( E\xR0^xJj0%E!$"DBv.V#QZ[>(Q[/]? H_?u(v-٧6€gt[DV?z}@jyMЮeNDǔ2d Y(Q`qcjV'tP[J 'tdr2gL V)H@mZH4}95VM_@G;b^U(͆͞ПZA9|{dž^\vW(-%K! }9 3z0-(7L: J D9B<$url-:?8~ x}:8o;gVߟXÜV3OTNuA@u֨Se^?-$0f<0_e>JINLLLLL2) ZԊɦE4KR!IEBkx}L$C*I{׶Hnd/^GlI_:x7 CQn;QD@JU*dUNIˀ12Zw.Oܕr?E<&w苍B|L*("0g k0+/W3iW%.xx= KesBW. _sǙnP 23=UŔ$־#{[&F%3ohD3 Cy|9¿)հѠufpyvV:%f>,4z|zØboTYG7;5uE]f3BH;3 Q{>cwi=':ڱM;5>v$K_wtk |'ؤlZ9i3NXRF7j,qn7T$]پS*- ňc_۲wTcQ+YԨ[qx+MT}}l2>9tQ1hu))c˜!gGB@eJ,YZW5~gϞ,12E8zannﱭòv> ybmU"m:Xݥ^w{J7FU(b05͚ճ(dh(bV[#챻sANIOVҹҖ; j3Op͌9m3ET !piJ,qJDt.2Ws]_f,LFC)S3慰Dh<'^UTX2.C34,}ܜ&[yi_WV$+1e31Ѕ+/j)sus T2g_mrK& Aq~٤:2Ϟ:+s@V-$@)YYr!wM#),"`M2Ku'F i&d'QK(Qت8 ,22 'GFTLd%/2:K Fh,dp:'5qgK(Pw9;tTmѡ*CIЛQX! V׶SyCy/L@iDI;%$t&{J6r^zus긦iAԛ]VGdhrEܸѽ5"8To(JN>V|5()PC)N͉{D.ʼn^'b/3e=|=ҧҼA֗UYJ?eIH3X.e;aSdpLIw[&$`;VaMY㟲|>Џ8Z k=]^(돨,1/Khc)KL:,1YDzwVx>9A]T`h}6f ʍ5J1DtVJ.b Z}1|@CqL(кqޠ9ze|C}CC߱Ez ܻng*@Lh&]2×?ߌ4-93"n=g?mY81EeLb\)"\C@d۔ꖮ}G\m~eIu2G--qJJޢue{ fjx!Pd廄ZT )UDALjl캝-.CuWA'۪jn겿+Ң׷P hX>YR4< Ǐr^^lN)Ȝ?%+ 8pڦHYp#ZLPޡ%y7er`? M . 
[Unrecoverable binary data: gzip-compressed log file `var/home/core/zuul-output/logs/kubelet.log.gz` extracted from a tar archive; the compressed byte stream cannot be rendered as text.]
":MsR4AQ2D² a9_}<{o__F0 s$H#LYII"*?)BE\3v8^4ho*0_77⎻;>{ǜD--o_pnś:LwH[óuiMϚΘ:޼fx;Msgd^ e>JNxۭZ]÷ɴ<NjwÎ>@IHI!Q!HB&6)Q?!ٲт aQM]KWn۫1];˓(3{yc./~dE͟Inkkdq/*w%Crz}$W1 [e^SI]ʯ`ڊBu;I bC_byB [u*8 iQ]L\wVBՅx|ugum7/8& UG.7^wh7[2< ׽q@/H(rNRXJDt 0:lF!b >[pvs}HAj^܂]v0aɄ 3E9C Vj$ U&JByEfo~)&BO) 8Lt8EF+)+4H V:̱2#Ɉ @mF+)aSYZekeI1AZC*ȄHwo4YӧF3O/"\B, `jʒS,Q%| gC"3ExK]6o֯@ؠ z2 @7 ;`<}c g0722̿HlmN҄oX9g 6o}L.uh(z%Whqv1|nc6ZQlѻ?2D2IߛfKI2Dg72'(=6t<#ɳZzڬp@梃̠RF1!H'Pg E*X]f ׼B]Z@ TG¼t-׭uxue=iޞx8wfXnlg]-+w5Of'\ =XPڱu*jRA$=p XE'})Ih5\ sPll@%'Iȉjk1QZ>b+eQ dDCX?;/tF5tsCC! :KMl‹>eI>|~a&ݿc{~j us5J;˞[V3>]T-3k  9>z FרtcF 0q@/!gX&ʔDD3&!Fie*%`(D RhZ pvҽ/̘3ĩ63h9? <]rlaݛ9]}}8&1X4&ۘMML@'s8&r6S{~J"\R"D'0$UCӖe(z` "@)!c/R̠R)ZpT={#gʹʼn=rIMN\jtTМOR[puCLgU߸WΌ74Ӧ`!.؅}&Z.M佤ͦ2Y@ٻ6vWX~M@:f6n{\.\2)_N*=!)Q( hy\e[̠@_ߺaT*ܺݲE8n e-~mƝ/}.wh]f-[_ݞAwٴ얘rq;n zJ8ߵޒ,B4׏nkRϢEꮶyJciڱc|vBpnyvӱ P o c~ߑA&(bh¿\f}mHBr@xGIAkYg%3db.E(:;BZS葭1J ;2&=A`KSו ^\? ׶м ?RzZ88tץx ݗ򩜴=@(5huBaI(PezٔF`x:`"FiH R_(*Oϑi2ѬIFiqDðaOsfNzيnWUOUZ>'!C,iC.R,z4 > QI %)D Fi<F|W^5y }0U&!xԬ8J*ECm*13-a0>`$t59>^馏vh9OfXKξVm?\^x3Oϳ<ͅzW/{4I o'jΦfkA .~3ЈP](S 'w+wb>Uٴ{8/GYdNFJ=:BwK?;ȚSv[1QD.nubU3/imuaRWf~j,t$d͔_VrLxSubw|k a}o4"kS\Y[۞O{Hp0rCu}qK6=]_~bHY)<1ɪ:kAޏ7} Cfz7NG_ЦNFxؿe*[ܝOG\ιwd VmG~ҵڄ#ihf8hDT76y7]ݱ 󥘬-U$|19,&~D=qf=宅dR"HAXFl)3UH*$H|T?V?l>?0_!|58NjK!xg-(q>D: dUE F7^ЫQo=Mnr7+bMIOlD@yH ꆼeւOc)FZ}i3 R>_%U16 d:\:P:Ĉ zK/5a&l=jm- Lي⥶9HU-eI)kOBKPjF ƘG|!akHe$7-mojpxeӯ=r:dnJC%fm.C: B|Cx(l&)LN4%%/>@l=9&|$cqIBA"*eՁ$kpܭǵB8טfk[me%ab,ɲ4[ܲ7%8w<DT1/T3ْLNSkl8wSĊ/d/nח#uln6U~54Tj}!]lU/}뵷K^~(^h YRЛ [1>˂FĈjh EÜQ˨4r85Q~R M]1[X0f"$Y5Ӣd Tς(z!YI:\Qʍ腶)bGH<4Yg^zHjgwA!OK@5i X$ ,Reܢ<`%Ϭ*j,1FKy2u?QQZ"+Y\YI46,.3)sf5zgκ@cHQ8NPv5%X6 XKD!R )HyYe H+˿ M㠜F^NYDf;%v.eRf3J#s ̼J1(!Q*rQ J6nGjjOP|iwPVIl&&d`k.0L*4Ig[1Qգz\.Nv4}k|󺰦pC<=&]"C.K]Rz"QO 39Z`45GŌu-Wl]<.R}KN((㏲X  1j))cJQ})%sh8USz& PQCw-pjm :wF߲dMcJBzMe=VaZI9CY{= m5+&㨀M,[ķeyE-mNC{4Be67֓ram]3.B‚>u-JKc0Tc}pB-AgTc->{Yv*pk9>Z^@NLFfTE&D!ʒZmBHsR&1;/< dё uNٛ(1 Lpv`v ^@M]̔>榭UNіZwo=˟?f1URt8gbGNp4W"AxN:#Zt_W-6y^ZDueKW5bF9?mZ''~ZÛnoixmY[:}cM[jA\\~O c|\ 
&tvu+B~dr{^ty:S[xbwr ƪ^tmiTI봉{]򵱳;B{p\lGLL yʇ ,Q.::c ӹTRW%%0#>Sj8ԅN½~kNBr1>cWi)"eRbJT@3!N=%k ,"Q ('\tRǜB^TEkŒe(1xIxkpv" هT@ח`ޖzA1O@IF Yy@N6y$t; Jл:{e (>)NF֟2=ĚGiHM¥{譆OYuʴΪOp8zPdLq}SNg_OgCP*9}۫ıL.j\|"a*htyi, 12jeVv)mR ^]H(T$ EA%AakZuLM:,93K`.>(R e+RA526؎j%{bVpju//^wͿ|lЅeEҗB08$XA!KF@XR`] 0q"vj셬-& BF3E'$*"]/[`n6;Φy<n%ګ=2!e ,xJe QNuJM ٖuD&[[6ºRtF[ %4!c d),G( \Lj:޼pި_Wx(L>EDՈ#"n+{> E˱fUAoȚABڦN a6&ZDgT9&dJyIP:bƈl8w#S5Yi싋1.G\ܚ)aR$ް C"dKI`)яx09ZIǾxpϬDaN Ϳ_#wJyrލbY_n'QV^F՗nU] Qɫ?S:vk~«_-S&??S8>ξ>hӌ6\go:!w`RK]Z^#Bj/H׳\΅?{z/| e#M U]5x8v0_u _=dž]>כOˏoؓ޵-7r\_AeS #4 eʳ/QW W$Ahj\᠉&ٔșc?HSNɬlh7L毳=W׋ |OpW~3[=Izڈf7ko? pۖ_P׋m;b~Swkv뾋[׬Un)F $sz+]<ׇ^rXjF!x^h˫bWX#iO5jPՆvc)p?9O2$偫WvRxօN~Y\sx}=IYr ܳhfJq3S80tP.$7Qޝ2gV5YzGϯOSGJow~g#]KUBZ6egy{}k4<] $Ÿ?~ĉߔD}u݊vQ~8]Cp MZ^~ FcW*Fm_{v#jk/Y{ؙ]h!*dǶTxHRzBc)KўV2WղlBzO0w %k!1Hd?4UU`s."YgW\]T>2sօE#p U>m\'S$NOMʆb!Ī!R8 ZI  Ƈh#>Zw}2u1qWCj ,׶Xkxhhc#. 1piA/&9ІhT&o%_GW1*#D ]J!v=&II hv)0ָcFY8 9HFgP!t9is^Ou@ИXK4hQih Lh 6e#s4(qإFt& gh P e v7H-24\ɂ,p0",0Ljr΢b@r",U`k6֜ÿy;mt"q,TRhěW i(U9uYEae +In%LzH/q5$o]Beڍ!K֭!hLb1wqWs]y:i\ӔIiY@ԵE+Z`8L&Ses`*  >;5lmG5$繨)@UGCs6zPN25<4 |@aQB^o*(9͡xyX%\CD[9DKh,wЩzBAA0J$ѥ5@OO LPvGoڮİnhl*dO_6+bE_h"\x>W+fP)o'u63 C ;bC!DE #HbQ:ߞ9c0p׳X:e@ $27jӼ#Ya=+NFh pNiu)Vndj;kҮV g=&EY,JL -(Քq'݃r5Wh[o ƛZ TfDV:ZCi`.rdС 5@+hX4S#j|dj)RpV;ͤ5rIbUv-D,0ԲK1;FflR^E!2 C$) q9. 
r^JרvwEB Tv70&X`j?®z\Y8_?AKgul猏4:Y89 H5IN lq!\^ճwh 9:\ '9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN ru9̺xIN q9'{1N D+w!ʝ/!'L_ZEN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"''́Ͽ 'P w'PVjQ*r}N pEN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'ЧK@ n_H3/ h{N D,9>G'l DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@|@w̨G~v5mחN,ͯ/n+G@17^_$ eqyYl|{?j7h=f=+gyNI;o5sǷdMN r;m:ة X`;iRQX a򼜟ͰQ_u霎J4?`ˣpyGo;j6a3>+t=2t G;?T{au;}IWk~v}^6/w_]ꌶ̫;HD޵CŮc7.BrVU7P}ޝ.;~ Woww{žaGb(ܗ&M훯OFcp~tNǟMƆE7ݺϊto7{ׯf_kslzC~(o Hxz>>@㿖zfehGȭj~JX(4T1;}6[ev6f_}f|ּ'8}~.g3G}NWy>|ems` )tK:;pz5Ft׸uw sͽ7Nohn',_U_>[ؼzl>WG{qf|̐ӣǼu"$wYq"6n՛} HF'pwoxԎaC_@?KϞ+ʑp]|t;:&;1;Boo[ݧȏpt%.gw#ߕUqZ=C~Ꝟz7mO;Ot:TKUg=(ˑ+t?"^}joDavr K7sn*' n'M7jSNLF[66#acY^2LOEnR6EJowDm F>%­F/Lj Ds+<8nagMBK޵Lfy| ŷbݞ5hٳ_5/ yܒ!`#=鐭x_cUKôػnWVvg`'zJ}r^;x۽E;E'/^a{U,JWmZ6_i{ݡ7L۽Q3meOƣZ H<;[?1.tbܢsx ,)a?mIgin<+A7vJG޵xһMGӘbߩ%'2}dJJfvS?ⱷ>ݧQjc2#91zkmF_,0e1coo]` .Yp`RԲgE!ɲZiҤ ؒd5YUYXnU{j_r[>'q?S\ {B2L lbS-ٹ±)z}'8aIZ61;?TuU~[[f|Eq8刂@.F$Nm$9@"tn@?-?6j0@D!e,zg՘IDk1hFs+R~އĬ "K?a uJƵf_ρ!Ke-0-WwP?i{X\6X=z[0//4eG }3+UL5E=gɨ&¼Qaf`iOWkCe)e5Ov6.\6y"֏f{ ]̯nfLUk]ۛȭPOMJATU^bJDCs(]*ץWY;*].#7 d%[v[>yYxw卛5߸nw-]<~7?Ϧ7K@~_Qw;+GwVo?=*S2#[fm/BaܓR^^<čs1;O]?]/vQkrD\`h.:[i^7sBa}ww7\:j #/R֭6WqyѠ'MBV4ˊvmN0+'wŊ4_¿i!L{)c./_zޥW%ͅr8g\Zr[$bz? 띢ӱޏ w$4h1zC 2OZp"Jc^>rhFΞ$|?% ^i~]}0+&< ]kN!R1DcMBTkE5X@*b! 
^#Qخ>Kp3"اM ƀ>$%ADd%0lY{#g/P6m3e07œcSy|)+vX]iZl Ey֢;X[,Ӈ6Keg ٮD*z]}QVA1tR~]~એeN N!FH,8?Xuz"{IXFJs 3sc)Qs @Yb&L7#[!9ʡamGsRȰ,AVJHKƾmoSu} ז]2Onc @3P` un|!wJqm2MIL܌_`ٸ=@N/aN(Esfͭ$GZez 1(4ĸ cDP)sĦ!+tPLhb4\F('aUu-vKF[\dJ)ߢ uB ='<Y;CR+Ɇ"G)a˱bq9DCiV948ԊK},<`?2e~B˔ XEOdG춨R# ) wޭN~6L:⻤؞uyAE18GN1͈!odp'daǂbdDRĕlpz jRLV>#ɠ6rsE@O~|HNɜ}RHid6:DƢg m +,S:2/8$0@Qߖ&?=>x*FKWD- i ,*b5qYe4D )YiA(O ɠWՎeJ\V>8O#ꍳ׸GŨށTxbG 䕋DdS*hﲿ]e}x v5ywv_s&ڪV:oJ]U&Egƕ' &t!t6:^YwY߹=ҍ;j|no(dA܎KH"Mf4df.MGLsrC vN &5w@cmhg^S}>mjPS= l)v͎Ujs*w{#K33gYj童&L.W~Qpd^| :iy:.iͳ.tƲE+S \S6_iU07,f||4c}}'1YRwU<&~|}\nc45cl@y("`b,CQM% f`6b9QpǜE>0f,'+D<_6z(1*FC@*(KJ y d| >Xϕ66h8d L2NF*ՠzQz]ʫE~ܽƃEn"G'n`Ē2:.ד#/]"&9"bzpj: ZsɵT&GXHGhI*OM?ݙbS>SLԙbzEOrHZu? 2Q9F Sk,`BNV9Cc8=F8TZxaR[-aib joa`" J@L[Qʨe&ZJl HMa#|OH6k&ٌd)ڥ5uO0 !M YWSLXFM1:+<{3rqoDzwIVCf8}/«j o_}jW4W?Jzz^ pxWwʼY|%s>Au3]Žc[z _ϯH* Nn|R/|)3 ]U$auί'\ F:!:utic;ueĖ,֢}6[dP 4@5' iim`B1 d:τD) X~J)"\JlB O¨U1kQt@<Gg=3bd_()rf `(fļ\Ru[%Z*$"4`;ђ Iq …E %6c˱#2\*밊a,ȴd"\B6xDM7{,]qGUyyoS1e &>"Ig`00=L9F 4bZ(z_3>,VI;wVڦڲ[1W~a']NơuLR^> u~?x LqY̙C:<NcĂa6ݻo[KGzdcgⳂJOö \(w` O=cF@1gz4‚F߉yxPM1qMoc_iy[סP.MaNrȲOC'W]Ǣc1Ѿi5;aE?Uk5/kJ''Rw9N&Z@[it?p\ fYeG _{{!G%Cm'{sre9Q9|hl%s0!0D`\̊H0$ T h-0d0)'GEJrZ)ъ^wt4D{QY; :H(F[ :aFh=F>QB+|JqiE>W| TrgyLO,. xe*H'[#sU@\y0hO,u'I/^V0#)M<y/O'L0&E-G>dSg&{$1cR;ĬW逰 ojRz\T?{W/pID*`ٛf%/!eV IjCXCo09bQ-%`WU :J缍%Ԃh0q6TyۛOG K7̫tZglZ֡ Gv׹maU'W7b[tN?s^ן$֡^;s.51u2q R4MVɚأHjm jF-WTjgmlm𹩓sgUa/[L{uNhf5PmijDr ЀAR ljdk \' md/WkMї+T)3I|ާ=0q`Q/Q̭ڣcKS1g XW$Qu)FsV6D'7jZV,' 5F26̅ʐZ,AP}%NqT(ɥP +y?&pE[CwǸ-q~l`f8{[ fpRJަB7ҀZP8޾?Ɣ3z2dK`SStUWkϔZDAd2ZϽut(q}E|6:iɦ~EoMq7J~ظPiP#`<9_ӎMf\Nӭ;WQ7g?>Q#Ř 1;g/e`1wXAƆA,I塶)y&Ģܠs䷙ҴEҴi9/9G%d>pk4-Y5[UٹMbc9^Um λ5F;yi V3ZS8lkMZ9P0dT!d]T!P;˦3UEKUqͻ;7bꅌZH)"*@;&=leWl˾ަ l*>}kKY-g' VJ”$x"FE=Ԡ:kCP̬O֚DPůAL&  km{3Oy#N@%Q+1%e63g6bnZ)ؼMɶ̎cX֔r҉R/ƏJH|e|7W"x?wWAYh@)"VjN pbT0±6."~5/Ȧ桾#)Kmn c]uR6A9rc RT9$1<.V!Zl[ ,yo}^2.Ϋ=^?y+ Od'J6>ZõO}QR o&Jhj[O|a=h>ۓ,D>ʪQ'R@,wl |p$d{:sP,+W`<\tE+MqWnFzԯ__ҋ&ht,ҔKcBS -5^J> RB0gb?dmIPLt?xl@36Led|e B9?Bk. 
mfi~lVȃ |bq1NCڦp432Wya[w5֧ wo=a5wgtD(: ?s?xK8T c'P8$%Ox7֝zV?f8@tlWJgx竦sm>jӍ~8><|66CUk7"Ϧm㰌m3BkRlI|M.F'$ H!4r+R$TG͞䙫kz)QD`+?k@hS[&2 ShCJR3,/>"p)Xr+P.bTcmek@S AZ=J3VE ɓK6UDpAA1S[,Tײ0q+|=+r?va'?6"__0PjL^(Nm6E_*ױJG,134=4+N<%wTh+|;'7 \Yŷj-`@`䋇)'Wñm+Eq{l$7Q&`*b R|.х 's\$ Q!ZX)nBEsGq %%S$Vq!uTLd_Ĺȅ7 e/_Y\_]Oijn,dQWí=ԭ}ǿde ƽ84/b/r)eË:h$Ʃ(Ub In(n7dߜ刿Z$.ARwB~[E^"vʑ%>rW/O6y;>[*]?hAvsю+ީ7Js<޹ή9Uzồwg?28Amf?h{oߝ[H⧓Ufޘu+]ɇW_[Lk,?!VY|~N~p~=ŴvUgn_C[ug/V]ضH_8[, ~i_؝5?vnyӨ:o8}ywߩ:?_o7ћ7?:;p,Ë"/_$@>YKG7ji]|i q>XȏfOGiEhwbo]띟N~nQ~8|u[>ʊw;zQwQWlt[u(ܮr!b3 .{~ uwX}7G&>'=u8O*1nEkȽ%!%C^$[-*B5{ȵ8gss*;!ny(\pU|'>b519'kCrf"1J|3\g Ӊa&;Ͼ͵}с*Sw$u'Nä,ϲ^{aYԘ-LSdkF$mKφ|IbȭTQաPJ|d9uuy^닅 )`XTVrXшIS5?9܆6*EZE;7 :͉KG}ᅿk[,]p^iyFœ.~3C~~׋37w]=K"<9:Q}^kϠ``2,,㫠EUMmsQm޳-nL"A8Cw ?m1x/I>:8޻hڰvzi7=ngw~>8}:]nwNNV XۓwNz@[.>3'qET>l5#^ǽzuifrgv;Z4.n޵{ge].:LoȇH-KkU+bw'h2x}ַ:M Yti:`eÍ`QE4P&]ʀPatM>R;r61b5)xٻ~b1*BB>&2 e%EM>zOrWbjS |ڃ:DyAL%;KT}t()Rt9i欪:^yr}RRfmg=Hdt=fNqzXvˇO';Yn)Ǟ\͑5NhhM?cA ҙܜαZcO|ɬ}Zm$Bbɨ|W!'c#ZZ!ss%E6it:ZaB ?ўutV@N0Q9_ D*USFG>*R(- O9 gfR 2)}qDg()f $JI@yU D;.Qd0)/VA,yoLD@Y#.L-!#u dF|Vj;c`" R',(ДYn(EEc6Pc.ro@PCo0Ȧ)c()9OE>\fsɄӤ22)!}M|zxIiNc_~FhE$ؕ,? NfP +Z#d B&( zNc-AɂA  ?7 ~DUuAS[K1tX,ꈟ58TLc*u &np`R 3G,QMSr6ЙlWd7M-VTLgqJQš\ єL0 !:wB=/:k qJ46T,]ZD5=+((wGAȼcIL'@H/>X֠$DtP(k46Zb$FbڰI&}k?VqC h4ό>tKv֞bA\ JhNcFzPIk!ԄK.ݦ Ü񈝻.t%y&d6Q t56b(h1LcqLΣ'ic]Y. Q$JDOkk* !X1;`èiX Ѽu{:K̨%+&F@ģVϏ08n2.̊d:P?QFbyT'PlJeD!v"}XoݷۅhQEUv!ȕ%V  XF;KrInz YyH! 
5y.p Jpe3nR"`H "<ѳv9tgK brbF,x0"4p8~ Ƃy8+ 衲},5⬐ -dlC'jjZ e"A7yt֙tv4YFF4[jRJЎo5QJvL hoV I-hVsr7ǭ F˴~ynzxnO]nn!zf&Y$P`.`5F֞5`aM˿h ~;Z,Qi6fѲFhƴg32 P0#lRӑpݐ =a]˩4ts \A727S;`gMJtCۊNԃ`XR:AgPOt4<$OC,Id̰D SY"e[ރKW(gnI%>)LTSD٬Y@_|6%'§Gb=%(v%[(E%?QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D *HEgT"x@07GB*+`@(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@_%gT@ރl?%M(<@(E%P4hQH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(]%vHI 2HH s}4Jz%tF@ߠVD $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@v@o7g`u[Myԛ݆kx~Qק͵w': չ'lU8ZY5jQXrjO?6,y{{i{Rm?\!@ӳ^_/קG//nmZKoW]lLH۽ Ѩh%˗@/0~ |ysz9 Y߬WK:&>a?|4p|J[%4W->!K˻Bh;-W2牢|t$2AOgANɈd; x!,qIe.d QSs'sG;xвk*-r)RA^˟2_ܐZf%A]@bXA QгtG@Yg.<{\aƧ:loow+ `$D$OٔU%J0SKhKBw0:f{la_cDqh_={6=ŋO1Ǘjy憺޵Kj 7;K{wE'Cx̀ɱ_v(=ke{m+Z?_w'>WJ؍*zLŢ * Tl֔CBETi?i CJN#se ȥao9Fs 17 016;]{oHr*D`v $pƢfdJ+JSMRۢ$m[3r7'Y%S$+,QQh6aFu0ۚ ?UСp6*/'8J4ɓQ1{__˾#/}vLqE_[Ҳ׌׽ ڭ|#q9}qVTD8GZD.Xr1"tQDS8h!&: ]wvbI|KUi^L}W~ ][A+If˪l}ͦO 3ͶkGg\>Womw=S(~yliϭq9YD 0YsZ{,b;/+yr=(A*(b2 ")XqJAS$1 -Cs$D(9,k:v[t,I:0%r耕Z-Z`]_X-F+Gl+iy.Wu꼞NR#W9Ζ ;UR*:E&:^!e@Ʋ`陜-3:[VgaS.U4`p9978׸gHB` ,14c[o Gɫ2l{Su} מSL. 
@+P`u|#)d8;\U)"SfoM*~rz sZD)Z2li%%Yȹr+OŽ~Xpʀ/1F:w>00i@ȦPdbE$ * zGe 9;hL }ŝfC{P݋}H)-ϐ{;9"g=>[w~|Fl;xq],f)}H˾fYR۝7%6E*HpR H"`\ʏ,daǂbdD2կPβ%^E++S~/M.4_υ jlrI$h!2=3iLXQ?ITrA#FHܱ#kq*.)Rɐq cEF93ڹT[ 3ir[WYo [w.+ɻ#bWݽ5Z(u|ix^TD<(=h(<45`2B%s)Y(AQ<W k<:r֠Qǐԇ`b k +b a>P,baAN 0I|S3]5/"F%V!dD{j= ҂H{+r=2R=AIfc+.އ=2{,:Dԓ@QnnlJrB1:3iJ4 2ׂ&K؝TAeQz]* gj ]?o9o@FGo ov=S6'h4#.<{]ufģo4(vWIQ/z@nb%( GSjL ВTG$\.y9CgKOEfO?x-YIP%*1R$\`1dXGvZDyS9TKxaR[ H"'^qNnknLJ|7sI7mo >uwl LX :Ez#p=3.xoZG#p ]>$!sI6g_ݭbu ],랏ᓩ|oE'4E1|5 {w짟z)Pk*;DUy5gL:5&7oD*ZOԶ ;Af"N'F 7Wv1tݣ]'EvD= Չ^6 [£]pc'wz(Lt[iD0tԝ V&\X!ݯ=V]5rk $ ~1lZO꯮&*de(n5b{| ?܍o&@굟BS>qƆa$,`/Vz:`/U&GxiѠ͞_pWn"Mqe w`s; P "CZZc@!)P c+b2T+̡T+¥&L|!ZFe)M4Ā8z< 沑sd6Kٜ<oPRr "P@͈ya(r(JTIDh>vL%Ri- 2}[($+U RYU4eUD-s#F\4&jciCwyQy\5Qw;'S`,S `>+t cÔ3l0AS-:9EwޝZe.%=C[jip^FUZc)^Tz: +G4o/j\[F㺼Uc+S2`K.-` tS=mOcfYT{dي<+1lGz9&LLryUB($<3@cĜE J}/{ P*wƬ״ )F6 *>]X@h~5gT >ſշojeLM帧E/ݴ>{?حdDOؿ&BFDŽC+s(Z5KoaƖUA覭-t ?@P ޾h `Z,'OK-}R k,`̽PU:T}bqFGYTNH2 ʔ"PURRD%r9KhE\S0 +@Tl=N;N4ʽ3ø)a'oGAcSl KKȷ~| ̧tF-LEǾ0 ?}eN2=x^ū%/5檂)bm$gX*,^,^V0#*)MuuLV&zG#3[_d6&C!fH |rVk/;^sXe<=lNŌd~v[ Nv@Zo(NFZ߮fi7rATds>``t!r0{+%R.f#簠, 4sIONM2Q'z5Wl:S%]׷?-,IiUo6=5ik$q{|KrU+2calJ`LR2N"b|@q$!A8վ.$[֎F&b/n|?G:M;i8CMGKJ\b\8 4,T." 
22̧ٚlE R 1R֊BbL%*ZĶ.+(V)S, hfH(zMgaQqu3rL`)HiC"{e$xTP :PPsߊ9>˦>kVVw9 s_1xOYZ+A7,js1N$ bǰuDPθ̞$8tcnG^p(Վ|cRfhu*R;keH&;<E1E%m'øz$k[5^?Q3"`1[CflQB:&mپ S#^iynԐN;TnHFh,Pl(AX'>.bBK Ew9;iD.6N('XrtR(E+ds5Ɣ`C㧣FhED YѱeT΢(Zfe$=Բj'Tr;j!#bO伵ŁKC9kFΖr՟$fv.Fˇ2)\KRkdQA%JJ@DZO謤@M s2jf,&Y4EO?iqC֨ BD=b(YFd`#Zy 0(_(Ӡ8(9_L)g+)!u=~QBLBV'~ lvf@5#r=8$GE`S>iAh m""L?;~ьEv>XVǴ[^\H(t ȡ}(а簠7%$>!b"3G;XՌCvh,-A/հbl.[nUD?Y%$ &Ol+c;?5G §JuNvY+=z54o(-D>JߨS۔4IDepx0`DKGq$oS x?שw-؈=E*띌Ko(㛺+7~Ȓ%}`*\dJ(Nփd˩RSq&xX!Vm{hi\)KMyqwg+5phR#:/luhVR(`<$ m6sRb5$HsH2`.x >Q):;AsGEd)*Ȑ0(ʚ&4@*dD8ی ij:kh|9}!d[ Y2`:{?!Z9_veg7<6"wl}yxs(KsmSR| k|"{WG@Ax}%MkOSrO ܂oiwU6eFe9'-4:HRTEhжY KmgN|eJH1Ye `!rwϩcQ&">ygK jZVPϸH$Y:>imfR&8l.Ȓȭg~5#g36 e rݭ`E\Q(BI`,hr*uچ@U Rz"5cS̋_uF9CV|_sLF,).Wϼ"ϤbNJA6U Kg(fcVXWl6 zMX1x.x+MA*gF9#Ij?Y $tkyð`H2*6y\'%1Y房\DJm3QMCz>NcHow朿|Zz'hÿ-bG]?4u$`Y?m[VStJ3--n5S}qoxuBiĜďi.ޭJYmyYm_߁V~J`n_d2j26.sY~cA4:`}ՒϦO=\s|2>lظwzMn|V˨qƶi%#ua֏* ~!h~c] yŇ#ev͏~o߿Ͽ{#_9!$`BgK߫__ڪVKzmv-z·XAGްg6R#׶ _~40c=@*O=a˥dx\AWU<ʫaYOzot#zz*o\S'W!1n@`Sjw}Y}tM|=U߸ll4!b֖((,Vf"MH 3OzJU'>S FLPE01 + iE&8ebDUG)'4 :MuL3Jfr[xujHyeJ /^lj,ȪW0Z )b%ں2vId=v'H $YSu,aJ2OP1!\$MpM_mѱ OgWZ=qZ+ ] 5wWxf=Q`N{87KϐN}?{.'o*l/B) hn|z_-Aܒ*ǒo+7Voh@^.x<]RCN3g_WX< ݈O鸾ޣh3'5Ը&/揟naH#i{dȵJb\9ŷŦt$vM;{SGxͮ 1wsP3A"t`µ,BMI(]2)7iTBPɑ.k9r]R";֩lb.]mnKFfzre) \c7V̻MTkw &VGS<:wmH4l|!vvX{3A"''-ZdK,l84YMOz~zQN˕ՠwYRƩ_{jJXmRQHy :'$יǃℚe ,e=7X&8z?Ꮿ5r2yy [MrTt6'Pm@uTA HƂFdu GQBǻՒod<;ns{yOoio~|f~݋!w/~ @ݓܧd3H]1kJ^K}caʞ2ȎE+ɑiqJxE \r -ޚʈ,Nݤ޻w^mIjޫ|7NҬmgA\Pї?E/gV*S. 
bWjkޯil€^TR0!b=_~"=/qSgy]Xzk<f2jbSb>,D`V>bI0,C)R90$FӘ!*Ϻ ?S#?S}k!M9 a=_>|רM5Т#Rvѹ2/LwN7)HU1Gz7eܲ2 o'zkwMi3 ~E6]i;]-!Qu(=U\wcr=sגps9P䉚@ÅpH#9)Zs!}IJYqUJG"!jYXX&s3qd% L( ZsyT㽇^(MimCGl9؝YѤɸ/4bϺ`:M- j j_0=| ĝq!t9 3(!UB!砝 v-|k][Hk;H\S}T[qʴ'3椡 4)Y,vpJA5Yٳ!qJs~ƅv5^_?◓e,/L!I 9.KKr>\;UU~>RX|Rfϓ a0kO}rrU|~fk)W{nL]E΅d@]g2TB$Po;z|Do.Ǭ9Zp#YI?]`W߫ܺ\в%ŭkiJ̜Y̌䦮e"Myr6la[g77M{  ˹RXt*=Ahˊy 32I +D ?;-Ήn/sP tId@[ʁl fS&49W|N3mDr8a=Q##'.N)Nd)zuU߽9-6YqFz,6` dVAŹՉ`wI^dAUΎEڋ("l .9X$4鯌XH}|c˄ZirR'mQ1Vj/yi?S*qA%y=8yS?0UۅPkyWӌNR0,-6L/\407XAY۸[μZ~Ɵm/9J{6͢dFW$Z9|oCF8l«44a{cjM09T8LGv.ޫdt>~]_-jݭMҶEjهf tVE[p^]WFevXx:MZfzmɃ4s۽"DvmDǒlGY H\AH +< K+$WfAxL*{dA ns10 6g pm`JϘPVE,b6+qņS֗LK_C؃JܓlE^+2Dg6:Pij ۀ&1:ŜMrR53=$ ipg<,g)pC5Ѥ,tYgo|*A;ͤ,DIj%'۩CYDFH共Ld%ٱeP8 h%#o.j޵#E/ \|?l6{3aX0AYHv<+AXVU N$"& w 4Rfc :\uTӐmCU-&AϦeb0܏Gk*键?eY7D);Ynk&Rl~/UꮳV7WGdrqw8Q(JN=V|(RikQ"Z3˵SZi {D/ʼn^'`/[:]ʻn}Qq9^v?eIF10f6: ]g[x>R4¦Ȍᘒ 1IWLII VʲN >Z[LARAחST ʝuJ1Eth%-6 7hUkz"& *R!`$0A 1+FS!ϐeU*y A:90#g*A?A>L ;&!X)Ug!W@ڽI3*OF,9xGE3dzW$j2bj׷߫AG[ qDT~ hކO9x9F:)K a=>B)ېQ%dyM7zUڦs5뱛Y4@RZ<ޖY.\`觳a}|1@C N2cӌO WMYu~n~}Gk2>E'ޮv_Ʒ,7ϵUܸV[ՅQpXbZJfkA[+usmH뙐)ׇ4ð'k<}<Q!b \|Su CAms Tzj kv$һRذ,Ss\ :q2癑w A(] EǔY4͑<`a"DpE=b˘ Xz#m2'r ^B$&ΆUw]b#OjZzjA}O`+N T8I+%@KF5TT}7uv~ּ0*b!5sS9C%5'}`)NJMp\Tez5=ٖUCt[ N''udN'<9֊{+\Ӆja޽yi,v+߿[Gdsb+hBv,!!fP-S:gP3)Ӂ&\m>0U 72pUj{jazX4cK_ܰn޳em 5gzM/0_߽Np8>MOv9'ˑ2Pmh$YBf֑f+\ ٧U=t2+z % IXQHQܦ4F%+|b;|ʣ&f챫sǎQZ 1wkҎ5y@rhHǁ1C 6kJHVԹt ttG:SHgL :EȄ ($Z5 GQS։hCĹAGqǶQTyxe> RxbaMRG"[Ro$5>.w1cƨ>:F)4hzI`AUĹ#"H~qTKjZ_l~;Y4"F`" KaP ^E&gN)H Z9ay/>_ήZڱ?TᖬDi;k)ϑQw2 n5+7(kՏTHL'>x<ġM' M9dʝ/r3쌾A~{e\ZjsAn%K2Ƞєb>)8$òҤ$XU0y`BA8ř,mȔTAmAf7o{ rf۞=Őpvz ''"gϔZ5gFe`!yY 9(0+υn<9AϹTKK(*\9"فp/*RM-Vd<>l^}=~K( w#h3E^pQ`aVqΥXE7(D9/D1t"@20uu~,4;45 W1 HC4BkH"J&d RZ碳&gaPՃ Zgȅ`I*KN1LO^tϫő>R/7R&d`WNKgJS@qG@I6c\2w>Z#%S<Ë)9=n~l3 0#JX6k :]\ĥ)T贈Ϲ|:?C4Ox18 #?%zoV)Y_?=>qF=,rq&ed7[g4W K(kWi~`vzI>Mwg_[g#7KaG<ە dzӞxa-GocFnHJ>aaZ;̪Lp4}=L> /zz9]7gQ{MuBQ% )kR'P.}6%%`<[fU҉_g;/w}޾ w~-:'pP ׊,O@?DC׻7ڈZCxS mko0U5oy͸7̇KG[{dz?ގ?P],WUr9kRT|T@Wqף4Hd}\V:`~^^ɗRq]# 
XWBm^+/#]E_']xw4iDHX({Q0<] H2T'4-NzuKIEIӥ<@d"*!2ÙJ:RB"V$bJuTsj:m;]es⯚Z+I] %wWz%)1',/ nm8 =C:EjdPtU\nR-+ yVo{y:Uv*)9ErtC<,e1أpJ8/ca gC'<&TWUgBRlN>JH< O U>HрϳՉ{%ՌA Yf\P=kȠlT(o }yj7w.~|la$LO.h8g2Bb|s}6XCDa۫iٿ9d_ d1ƪFx6Fx}W#oM}#o7Fx}#o7Fx}#oٻ6r,W lo|?=Ց%JvRIed8U4V<<a'QBx_x }!/Bx_K!/Bx_ }YaaNo0KpPBx_ }!/T=Bx_ }!/Bx_Ks }Y .Bx_ }!/ 7:5L]bk=ZSJP7/}LryLVއP{ZYJPb=mڈC>zw{66¿ $B$(a:q.2U2G5%s Ƕˋ!I;|d*]hdu߼S}/ mr.W;(+CE:QE7J$lhI7ԯʗ0)auu\@;M:̫<:u]Gfxx'f2?URb]q~QQs6H>8t9gizN mSs0:%.Y;5WN{e|)[9S Qc%ǀ+8*ĹTq(pDU9ELQrW`˥s7"IR"&AU '( 1&$/&< l{3a`NG\kKu|nY@[ϷI1_bjҿtǃ,vC|/b8g}mjDfZ8kH<^yObB\rkDIVWt`"pUĄz٫NILh'[ 1!v: nPNAxͤeqJ(|`0"))Ih7%d{EX/, [4uzb,23@volI9(h0׌(l+4q^:YFyE)($RԝI%;ȞCM?r s]sVPաz g-u}Tgn4,>ɜ'8N?|X@{U3T4tw?˖{W;Y| T'j~D2k,Γ?goIɠ gb ^}!U&ж|;<ӟZDk!qFS2g:i9zhߊ@-"!L?1~t(%Q`DIV0 iCP44DƜNDkF CEܡwh'7@ÁL?'wZ 'NZ)_)E8ϪLɘvA h$DˆM LMgxxܗoP9JP9nq!/8`! .!F*Q"!ӋG_<x=+Fn|8ۂTe|[5}xfEkyfǃa4p0 hDZqHCXZpxY,g)y yV1$) c1S`Y]5;0[)qq/9^NהzC щ0 AcQ9Q=52Pv$\p2{xiw};XoB5cv[p`!al^É[^(\߸ θ()ZCHH@xng>*UP J߅RvycKB-Q;tr#+1 ht夊p|=eTIӘӄv(Tp3kƣ6'"TkL/_>g"lWZ|"_/n4fۣk^\98~5|ƣїa`TeVmZP}f|Y")7de ڦw:jiY(RҺ fl;{zGw!LcmyꓕB?uh5+q^{{^k!/Ok}@c7o'6\J#MݩRMs觻Fw3?~|M⇵LSלu<Z?sRYي v E&1X_f6@Zb>tr{~EL.N\ܳI`łd Jb eeX&n7x̻I'C衻R:cQţN)+}6B M3gf<\녔m|((s^EZ@ b0J ^UL?&!IlT~e2 i{-`4qǓр\8k-Ub>JcK@R 8H-D$Qa8ε)NӺom%ƍ%mQ;'-cPvzŇoJ0r"ND+"}1 EnrRqոQov nxT s=R7Ɓ; f܊!X)oѲ {~k9N R4ɧ5 ~c&;&u?C>{´oO.W;s,oUe0o ~7 v-ʭiQ{S;W cwo3/Vj5%Yb:j>#ޠOԵ1 M |-.zT]b4u2_BsLI>*i 4'8N\Z-/>'Q"4AFd>^D9o!*,*:j8 ?Ɣ"6O9++{a8Cf%L:1cNŶӨ ݨ<:Ov lT~m2TϷɬ,J9$pq+y9%'F\cjC '*d[*7t* ѩ.+sMO&0!eL[!/s70,SZ^{nXS ⊲2!3q.XO!t-5NT:b$)tf"L߁FTh<}P"mYft>f5a {cLҎq*,0r8B'=Jz46QasA6KfJAp+{p1@ fY 930D>DIe/8 Z#>ANn+ʓ4y$Mhte'blIK]r9} @۩'ؽ٭ H`V$kDzdI"!Jĝ|'61Z4YV7rn>\V!P?+FJ_]ƪ!Euyq.ʝR6>$twdN#-~6l8Gʯ;'7;Q箢C#DZi!5Rhu{KVggG> E9D#+?oΚ|PRΤm-;^ 4lP}&5 j՛%OZλ)HWʥZͺ atQE%p jtf'5:ir#,|E=UP[ W@^P4X6jr-6nG;OV'z.7DNu>?ipYLv@jRؚlRA#&b5!aqqcSy.1BN#n~ k-?o~#dZ],};toI H-?Rl:&op_MzlZJn~8aNvjVzŊnG6GO=Ic!Q<}$lku!dBreh{PX9!A!{KHi:}W?X `N۹h[iԔ(G %uJFѥF&F˭A3朌^Sqq!8-Q6,o8z^WJEШl.J+ 
`De@'o:@byJI{~up+>e2(S$Ez͏¤`VHRhU!lhm5@ ll8}T) 6*Tr BHҰ{/ }wǤ)+\)7jZ3;7{7DO^?dX^luPtXJfAGbJ_ K\|dt݂!{o=>6ݗ.B7>^9r iψk.|V1"vAgF+歮W4?}jWQP_{Wdԯ_\^c#8a{پ{ mԅ'e֜DvUfsEcm*ŹZ2BL)s F4a11ġABj!Z,l*Ewޚ6# s z,kUO1'VR5O { vU]r:*ê9GQ=@r qQ㢟^84)4J`>j8GԤv:UXW&\}Ue0;9U CJzqs}d#C;}+Mެx֊mWR3=:_?m,fw] a'XVg6ZgtARq Lh'5Ouhd5!2] "6iCjɵ E$nF_ PD8l8#82ke w_U)o+oo>vd~p9}__07{6FSl:¶'E_IJ ^|P=P h.QӧۣЯ'{9,rc(BK.$ 3(RώÆs8bEI1gGYL }qD6^q5mKɱ-JXeQ(;x({ !+2Yt4(Xѱ8j?jTs*͇ p<\%w=0%"S#wPr`3aŻ:hjvԼyus) :D.8J45&&ܘxEщoi 59yIq|Ij4 pD}:դqz8ל:Yɩq~K\|r5((2 o454 j%CEj@WZ&`, %.D\ g(85xx"+ 7jNCtNzmzm>6S&LlA&&O(OBҎL֥W7 EePap.ϲ^)_{l' 6.33![ ib77m ӦJ|_=&,֘m_.#bߧFo|ao&eOs^|M@lpK<&E6yh%y{q|_b7ޥa@y0kΝ0%JL2i-xϭ|OvQKRMڢl}lŢw5vb4[ IqhR™s"Ӊ oίpf_ Dx|ywGZOU u=TgWo SZ9`d)wek[5rc+3kAI38FMla9Zd!$X E+ _ԩѵÆs؃Y߁_]^>Ͱsoֿ^c-/IWQC)}ev1W&\֕; 㪓]9a:@?6կ;'7k_—q%E+!AՙّOBD+f}?ѻF78⛳zԺԪ3i[ˎWo&y6'31T߼  '-yyg.;-%z_}^b(j Zj#6"0E f*1$pKvzM}->"H*(-Y`+ EKQqUNU,59 o#cxߧI'z6,LTؑ5T7>TS[k6WpJ ͑NhŐ㰸8؂Y S{Z~:dZ]2ݱ̞Kf{\$e$z]ARlJ&߬^A( $qœԬr_R2l|/Z0zCԣ|$lku!dBreh{PX9!A!{KK4b]`06k~q;Jť^4EE߲TODEq=R,n.521Zn 51d|ꎋqymYetGcEPtRwͶa HP\~iO%pq0a"2QT ni < hl3(:_?V)GjaR0hj+$`t)RxتTGd6X >so`l9M]!*s)Hsj/`Uê5ӦʸzCt{|UCF_]E: ^AËcI͞`C[ dtӧæ !֟0b{wg/V ~9׃}տgw{5e~K3'48k*b.|V1'ݾ㋳"4^mtvK^EA}]}S\~qy{gǧćWv[ԟvq{Ub0jNvkk_;;[y;v/׋킴ҺTO͎hX.ۨlӚm/o{t0vbzj]@,Sn&!LXN#$Eܘ}Ici"]U\kkXJq+2zų+Rb43qA$XOl:1NR)}Q{#a9ѐ=N9 s-֐*&U0b9d2Xuފ62\>֜8a`jK.XGeXUH6H>?j\+u^!f`#F) F 爒߮V;ϊz 8ɉb#fOf]ķ+s!B0Z.Xj246ɰTͶb4ĵ&DkD&: =\bHM2V-苵h pdUSXX ]&X?*;ܦunL,ۢ-iIW_$T*-DKTihk^iN?w=ljВ-v8 9Je15=l8#6]^s6Q{u$joo W^Hd ;@j8QӶڒ޵q+X,AxxᇬO9@rrpbdhMz0=cѼN:bL~"u;XA ώ В!$al2*<.[lMV@B;坼O}@m5ëx5AXWj=gH&/|?jޖdhTǮiV$!$Sqgv)~^U  a3%)FJ.a/z\׋k˯0 <cDuq5 yΊ=< S1o=To񺚆֐,FFy^8IETsF^tulGf͛գ+Z+KTJ3-غ1S{n0pG7flo9HHƿ\]^jpG8~[דtk{R.u#wcnn+˯4',XVxtrW6|u}VCqkS|︜#cMd髆W'x5;ٱyJnhԐNkp<\pzNxϟ=~g?}ïg\WןUFX+lY࣮/wmDW]SˮStm|ŀЯU]>~?>XaH/W (Q{=vףlܬĊts=NUZ MbbThR93O,J5!0.&vtM|ORM(rwhDHX(eRK O.dJ=fc E˲<KwʗG~"$ U!J g +hJ lX\)+UɾQU L'y= ]XՇ$tG2%9ONQE-QgpVjj5yA<c?`Z2^+kJ{]PGxs=mڨc>zw{6%<6?*#&%7,fWpP"{U<iFPu{mo m!Ix$9};,NoﲬEkl̼{c 
YWbѣ{i;b!M)1iTFg-I|-|O?P"d~lukNN7f4HwĞq飻e~Nc'sqWmҸ9m_H98*rΛuNAw\-!vʂC]cm ;lPas 1p$kpn ʈ$&j".ȉNԚ )5u&&hmϊ2-PB9Yh韰2ؒX/d@5:>%o| >zy Y-e4uѸ[W>ܦA7pN*XE2 |9CWgpϸ :CАPŖX4>G%Jh<SEXp$UV }^7׃>v;3+ǧY.-yxρ~^rhwߘ?ğ]xȒVΫgfkFᲑ==kGY;GG)ZiTZ33!p993^nb~4\Y,,b tHv1EeLb\)"\D4/(Hٙ8;},O<6rvݦwYr礼;UGrx.Ԍ aBibBk1GIbB(&&%ל9.9hB;b(D$kQy\hh8#sh%3@d3qvL9D`$'Ph3EE_h8G#ƐC'uZ]n̞]Xd+yQsCs^ኖˢPP?աwUWu󛦩.x2]~7USrW/iڮ>UUIՌ/ߏwխO9ɷINNBNFC]jTU/@]jZSv]uk"}ͷKA4 KqxY3nzJX&;uw~Z^"o*Uʱ'7y?G'5.:W[|?h[[ƒq.'$meWMfDըj~ϫPŅ747xx1l5 8hQkjA%kuۃ-j\4Gv»4kj;1BS6x3*i"D1\ pŵWZ4eȚ̝,i+x&5²Ys;"ܼ^'߷-^l;OZNdI;oj9 6o]JʏA)&t@ˆ7K*$I%}qь妡ы_puy>-`u9'(,%v:֩'}~,:F'ƻ1 3$B) .gamqȝ}!?^:gZE  53LVaY{LH)1x+zLY\(: ںXW [ΉL9"q9]ko\9r+  =|_ vɬ0dѣYRԒמ`{ow}t%˚g,ݷ.Yu^*\Ձe evX&IDiyt .0{"C9j 4'ϒYnc9ض!nHԻmT7<>:ϫH[U1ً=YOɇyt ȥ0c+ׄc=MmNEyv9͛q&2le4uoޤ-t3a\R+|S*ִfkwpQ] mq^@Jb9 *uk}^?1?sid7[ VMO]}Y)<>)e3qȷsj瘯0*g޶dɛmmm`,yDj[Rj|AJ9)2Sؤc,1kѹjWRQRXW9JL梢gb\m)dll/ss`o&acYOO"9&1[Zj0Q8:Ek{0j\sd4?wVoFʤ՗ oS$=6`a-FQtDJd|*)JA%]HWM`%O{aaHJ/:>vŝ֟߭eĐ޿Nzt"7L-!p 6'iD Uʊ%hS/rl 3%EGA˗oa O Z嬝hުkܐ$LLdJ#S)}v͠QS`!l[LBf?׼E}J?-OU9пnl|4 |k px@/P b%zN" ti>> B6:%'-)Յ@"Jx <Ou =гE*AUT)Hzی!jTXvn~v"Mġی)Z \Wȉ x_7V%\_8g GTbh5e[fQ65}6U[vOI) jю'N*`U*O߯?KY}JRj&8Ba&ΚaOҝPXL8 }/a{7:XЫM);_`v̑'svO˚R89`yeo؋sfiGn*#GV{z-~?jBzdL70*s>qpw36o.06b!{~n{^oxwng!ݝ,ƷJ\~fu~dcݰi"؍~\hpr|r@w]Ҟ7.,i5rP{. 
(}91/|X $aIS66덀NOD֯|<=8,z7{uv\3^27G޼u0x?їVnß'G0s}}ßfӏkìN1b ̐C )9p VGfpX/t f 5]&O w{wV jӍڙ|io1[~ꝷCѴ~w֛/ໃ JqV6g]S_/'G|ڭJRկO/M}p*^ax0<^w" 6*0A9%ey-ßn3ܽQhYQrnャ""x'_6[;khHBg( rA^њdc˯;{ce!.dwikV8nf x 0ɪ8?hEq 9恹1,#q<p?yw-|-|yɜɻՙ9t \59 -ָ"Vv}Nʢx|dfK7t//m;]v]y5O5?~mbݯym.7uQ*k"JTNlߟo=ef,i'E7^׏5d2gQ)!\U\r[8jJWU=^S 86tZ]?_o!Zk)EL$?AE&N"ds-~b]md*YN.ǵCmJǾ˘/\XSZ@4'0Y;J=R-%$\BP!Ԋ46c7c6W|\Lk#MZʇ|&ѽr/GtQ_.He]ڦl76\!cM;a{0 -bB1"hD\[SguQLe`{G>Y[V1D.ڔ!uX /-SfcΔ!Iis5A*:(ɍ\XZpN MQ5r-F{ :bxXv KÚGIoG##$}Xȷ iE ҙܔNZcOk;i^,IQ!Ք8nlhe@r"幎@]0 B hH Ԭvi8 _f Fzsdnj|" R- O9nՀE%n<+p-i簾n筌3 ^ezJc#(qzM<}CE bȳեce6b)|c":)tƅ%D8:5L\s$^VbP!I*YՖ C,x@5CmޕA+0)eX|/@10 ܂ѕUV 4)*L@68/ pEKX(F EFzQuLO%ez|-BcnF25VJ`B]:YWBnU{RrEp00c:o]B) <,~"5ҾQD^1y>Ln(許Y40L\JSH(zc9T}`D5nWy0i3XCfi"\|+RAڛj,D\ ̑F+tdIw"=@7/:T? Au8GY }Oʺ@J @TNed0sRBޯ[AQ8PLi, Z| $F=(aL}j>VEp'2ЙnquYXt3RUYdbd@TXT50!!cM9;6ܖwkzڦf}-zD~pAw GƢi1Iխhaq:p8@.O0Jrr+klhvUkÝTHo`kVPX\ 5K&匎f,768- `:#$@(Ynø63)HT& ٗĠ PT 73N7 a4,ƬbZ4,j )bl%UKŁjq?rN:N霆Uj$k4ʬp'޺9H@)JM*56]%& N,y Zmg `0TSv`xnF\>/·Ǵ݃}9\ߕ ;ef-aM˿w4ͪd4Y];r1Y+ǤM+ q)qGhW6nSv#<>.4cg#BD;iP"p `XgR))#K;hLyPuoڬŰh*TOr˅[W TEsJiN '~T;:Iyc(iU ‘8'tYT1tT#P Ƴ^ 3~ [;QٔJǤ$ i'!3AcM hU[/Q&JLd2~*᪥Xsm#GWidC[ؽ_mk"@lcy]>gp"dPK=aa4FAuE[+NEl0:zr&%!@"H<MX|A*.DQl|D^]!މh2Vu5Yl5+30(K2\%#O_o9kOCOR@=ؕULP6G">1Dڽr O_y ^ؕc"@`{D$!!@Zf)@[$GI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BIoIrD$>5@YZ)R$7HiMZ# $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $зK11@π F Jy$H,@" H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! 
@qT$p@ ru4$HN gF[$`SI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $z.[5;/N~V\v{}\mRb}&zvW\#OB!l4q E'[d yOI6kYӇדɺ.U*E 2 p2Hc e &u3eA'-?*\K#U񣼈ų LӐJ b(gqGx\`QWu儮S4Q4G[I;3mSJgvueޜ [Cd]PKᬂԁ.8S(iR-[dѢNTk$lWu.^`%l;X.&h|߯9X &59n[ax0JezcKmF#Is8Qi^@2ʈ-kpɏ W%з'ΙGhvYa˿L&UkHrRU#1l"| .s?Vs![<,ךKs-=RjOqSۓ{*y;\.xY/w݉RW(iHّOp8Nր|7P1IlMOzݾ2t?S|߰5#ر8w9~ӉOVx ,ﮑbvc =\CΚfau;vu}Kuw#̺Vۭ:ߪc-fv붜y]騜>;˧hnKYLi%Ht:|IsADƜ·~`wrw('cn~rܺͼ EǨOv]a[`Jߔq-/{L_ߵG 1 Q p˵HUP+SżR2*)*,řA&̛ľA- ِ?JC-;ʜRNr,RmjzwGhINDYTRESRL>JcO$Im>-%B8εA7I3_o`qkJ<㐮!Mq xt4O'qC&׎M綺6#cz>i«^[to xЌ NDߙD, z>0Jֈ3o`\iq%JFJrHA^ ~1:tP1D/B>Q~KY?=$WU0*)1$f2ib) 9 *K$bT6G\3uQ9m^X"%y €h4@uzb,Rdt`Wu0q tDh@) H+kp̧(M *X62)wtVwE78̻(9  ūМ_..wMS=szz2o`R>׊,h)]4\W~WlWwz}x˫,ɫWūgc Eϛz2Oiʲ|s܏GA4gW~x9 k᧾ l#%͵ ͣ@Q޼awV8#zWLlbFϋ.SX>+.n \'W\ɇwISL*hIx_4Ӈ6˞{-N>,M@ZW.m'SpI]^Ly G C+Wɷ'?>\ \)_;mEH{^Za%r8.<;fM,8[v; eBO酬P1y&]fi۲{bCRyl=B%ܙÂ+NJ l޵X}myqp:_L@=bwdާOKR.}z^ 7,@إdwSdz ÆkWmW{)#fΝϵ.'m{[nUU *uGڻgR{\3>+deE'#5 V{_U*!+љ^"*xk5lXg(4LjM]J1 ;< ,AEmi-b`gn0qvЁK'V1|f#c3 6ȫҨφѠsJ&0:\\EOG>vH}l "[~}Wvuj5Kڻ:me-^֫ ^Clix sO9u#QwuUg9v"CLM֤eV)e< ,M(b>]UTG#,JrkU]񹍥rbjinYa2h/*>\A 'dSVݎW &.gvRO%]]/wo6_xK 7F~=sx?mfn:1~!UJAMMzSwp#z_tmEZ}$,1LL;7@u>i୍c2*Z驎9ocm<,@9S`g˴<ЭŞ|W=,?Z\|r|??Q5'Vcx|7Ha_plU$0Lv$/,,bdh #ϯbbIVKrZ1|sWK )&ޞth~H5'SNOčزUU Rqf=a֍dfTus`ĜMS9#2]pOYb(EoM"R9҉uԚbЪi i,_f&=6ouߖ ~'5݈1dM!VAHfB%Ij.XC419BA-%koWU΂E$:ʆцk6am+ޑz9=4*t&'v&häÃuw{W]7(oygS;&-U{M_.ckz+T̹ jJriQh\b#) }|[sB{RlJ1=8֩T\R5QjsS'53n8 4cC_x6<黴;/^td鋫ʻ|\O/V2v?k=1Ꜣ@E%%L$ƭ&$]S>aq h\U6!DXRM&K3|W{a^[GiǖxMG^{A!"IP社ќ-F'56j[B| Mm/s!oQ"Cʢ$Q# 2a%ss?\1nGzD7## YTb .Q- UkQ+ǐr R&361!rc65E'PuL %HyIv!G+Ggn{G]/.gmSsla_/~6O^AQx䇍.* 9a.9DqZcO.W)d=Řh-pvlq?;[wF13.9MX# nN}2p|cp|5brc=ZbOb/ڦئrML0A~i[XmN[TmN[}A9-'\iB(΅`˹{Qc\:tj]DP:R%7 6RpdTM`C97a3 d~E9!)vԝЫ;݄;tN5sp!<3KɠԚ dk)>R@ Z!1d}+Tr )lisZ[&ƸJF،>C&|#อk#{ڳmfNq0Y>  .o[p4%11O ΔI7]ʽi[Ն|SK!>8ʐ֓%|%I[kr@b*DO6gD?Z|7'=iP[ Z))Ĝ9[tJ 'mDZí)%^Lo4n|D~hMEﮂPE@)"VjNKfŨ9*`Cm\DDe;9äzk_ SM1=C}GR >K9"7&ߪ+,K()NlmUau`1z+ezɴd:x%D?9OHM(Ĺ= Z>1/D)ک}-19e|=(]䣬uB(2;MУwX=_ŹV( ;$`S) #H5T xI/.]!K48BS-hAw 
.bBR+R!QJ@5L5v,GY~?cFu[:='"]!x`[0y*ȱēۈ11{)MOE(OEi%ͥ<slm;ļ^v#lZЦF;#&*0dd1kr!@m=Ƴu&AD[8Ԓ8S7{g[A3Gmv0ODaAp+?>2HS&LdB'4.O+I΄kHam37Y_ZʘBnTg(VoC1ĵ'T5H4xFlVLUBBrɦ#8\)-6Q*kmf78;Lģ>[h|&da:tπ \8O-o&ȗc,I(5&}xCzGMї ulR#Q0lC }#Mk>͊)O|3x3+( Ma89WrV,p2Z (97aAR)yra0ܶY +3`#Y2,K\It;N\ɹ9ʼnDbh-yckYCa),b30C(KLK|օRc&SG&=D.7W ,ۧ{\'|Q{s6qNwn,&A81Z^ BJrBɌOb]&;Y*Ը_\S>S .FM#W7d1c#EkZd5 D wKqPd%RylW^w+ϥd>JAvMqUgHV\qv.pbi$K˳?/?.xښk9TȑjlRI$N;v!m)Hn6IEXk*mR"< T9[`4&jQ[S(5ap&z6՞ ?/" dq)'{4{sѥ x';9&Gd?YGQ/n _za_yU̥ՕxL< &eQ$nqn;’Qlf,мx~K/{_, 4v/#Y7Nf6Fl"F>[? g9o⸋omxsqwzԝ_Wǝür$gwIϿs+iM^^%^hn.>ڑD'(ov|wN/5v+?^^_g7І ao齅^ -vrʬy>/.n,Ƭ[ hJo^i2n2.sYY~VC8X(.`F/?{|ۇ iܾ\5^m9W>Yj;=Ҿ;k~MnyӠ:o8}yw߫:폯ӏxϿ㧷?[~~]zucxV"_#gKF-/ Y:g7F׬#._+x~X?.}z}R6b[b޸$ o:nO[룍1 [J9x#cHXH׸ZfbjC ")LsZ1VjRN@ƄPYgRa*{4 d8_j sZ|1vĚjI%zP6gvȑ~y 4ؙ1Fcwe_ V[b}JVd)eKm°%WeeEsH`m1pɢKs$"yx;c W;Ͼ ys}1S&2g: ᓎNKTPt^c抂GJ}剿MA.8C5=3&B|!e9DE ͂6˹fIDB5jCfȪ2hlG1ĵjC[]ۜpvyP9{"ԟuJ6ܔϧM{->9u=/_^]4yso ~w(fENM`ݢn0ܡ.MGRRR`g*+A*:m `I$ :8inhJM)1lbvK`*GS^}*`*%R7,9`լ+iMyI4W Zy?\ak&*L"sDUJIiۊ9Y!'W3玓Oo'f0|υyy>k8OVB٩+0*,7n:>߳Cm,|[x6<;iF4Hy:vߺm<} 3WKyT\=ݷGsrkn?Wsgk7<;sowݝ/.~YW|prt~g^ExaoKH_+L8=s<{K0 ًkdu7S.|VmpIkVG/b )P*{"Nۯ#x`'EO䒛rrM'ouޟHD '޳ mMR; `bZs1dZ҅m05 %o,Onz5L^]>idugN oU!MH[8lJє̍sh+SԜ$շeE/QI!QVEuOh0og7TCw0*glлo+qWҫsrxjߜuqe#:jT؇Pg2-';Кd19 %lMeTZ-/ {m(s*rfU`kZ/N\jZiߨ!A91JSQ9 >(s;ED?aIrXu9rϗ^ Pf3ǀ_ bhj:lar:N,$O&F;kBsGc^Fケ]{o̿oO?V9:)nV6w}_lK#^+ׇKQ|zjG?!~837/ؾ?YZ{&|A]oe^YXֿe/WAٜ׮sxP/ۂ.0 IBOH]LGWqa ۦϛ3%޻K"ϱa.ޏO?+67}秃}YGw<ϼprx {͒ڇ)i{}fb^/rqчsS)-˵Эbksq U^گْ~G#݅}6q,,Ag.ؑ@KF?\Lx!.Ýt}l'_?T/@>fgLIwsۛ8zg}2MA@[揙)o¸!Y1r>mjTPlQZbG[bfUlhbtn'YV3=kn9 wj9#KǙ4֢T͗D5`72!ZOgY9K%Ht@=?ɜIjUI9h4fVqQm8bX֢U<}~FRckg,iuscr7ZK)je"5]CR{hhS,G}8Z֜M/*P1U@RJjTK ce\|6wff`ObZ+4 1UwV>k0pfJd>7:,<.mS6J\t{ ׂ d,x2lt ]0ȷ|Y*:Y.$<;Q~bm=bT:@pK+T{w|?>`WYYgC?`2<#Jǜ)?G)A(p>=n.T͍),\[pN#ΩT" z[5j#ȵ3%nku}gtkCI5.'_\-q ~2 :Ag1!M/Y@- ɭi5Co4Ij^+b#8jAQM'c׍ %V3c(!9Y`!J(ՠC; 'A_r swf:nFU%RX7!KhH'KlФtVp1'z$;uV5:/'Ѓ" IdT%22Ї 6H>Z7  hl5j6Z + 8v VVESYߨ ~._Ÿs p 6YD I|ӗ5 6}p8-Lq'*v!fѱMS  { Х[~mŪ<$JI \+:1´lk5a 
HcUd,u[3:Fpjiuh8^Az2ϱ۰l,$f VDi2~{w`D)pX0Wa1f+ wU2B@f:!Xr+"~ L>Cjˢ9k4X#Ȳ;́ R)ԴKoEUĽ"mZ&B[V@¢lҰAU | F{T_rSv`FTٞ:˴Gm&lE^1H"^SSM'WD0C_u7tĹPic$i)WrQ$<[z#i,W1]Y\F fp `N\56xk#7+ZFj}ҁc Rժ $/QS֠ d bpUc fzeAV)M, o8A%V4e+*V`(o"02jx A+u!5f`JnÌ9>X2'Un66_gpXӥ1u1Q ;ff-Ф3Ydˡ ˮpEתWyW;gb!wT9蕲; ۭ^}#b/n>:zI XhaB S)O2WH-gsy}p&B;O-O:mC.x*|J xJ qI &|@/g>% +:sC *hJJJJJJJJJJJJJJJJJJJJJJJoV }FJ ~6JnnJ}ȓWu+I%з28@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C *LFJ QD@ 4Ϫ %з8@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@C 4@bJ{ȑ_ewia`>dl0`}'0x-;غmT,' 8Vjba&Pa&Pa&Pa&Pa&Pa}L J>'&SBqazL @\@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ T@ 0L.΋jIӄ]~8ɻH'A 8¨׿H⑔}/a`.6 /Rr/No><ᒕ^{ Gu!K8w'~ZV`%h 6i<>E'K/Qp.KFi07??:8}lxʾ/vŢ+Q+*"L.-:]PW #r^ZeW<Ґ;z6QC}MlDy{W LHɓ7z<3F^:lvo^P^FuFȤF;v=bĨ RçX6LMK{^q5^|W1繋;i:ug2 .¨ ##X2]/ͶbF PXRFG FJ.ћoߦD@? /2⢇\oa7fn t?vVs]vuCY*[?8 <N3il0m<]ʂ>W-I%@n gUsIeقI;ǽ}w`ઞoP ~2ԝs ԧ%5Ng0ȍ:k@O[~27m<²$_by#n^5gÉYl.~<\[MmA۳ ;{KvYo^iP`\;\%MMoͻ.fOpxfo+ &rz7`<!ݛ)-kXŠoVh4o(-ZxIm;\GZ*ozՄ,7FD /3,JJ{+8Z.ꂲ֠7 s0O?$GӕK"DSɲ([,sЭ2 1=Ι* N5NMKFIFG"Ź7j]ص..u >0GJJ(1a6q&߄CrǾdx7i|xDE5*"5hm*zʭ$JPrB md Bzt2zllڂ_VS:lZ_"-~MMbqE ?O$Ahf2'bt\1Fwq% g//?̦C\ؼD&fܺ,^__N$^"?q;_7ZSJuk$Iz! d?>Qُ]w&exN}M:Cxèf,VRZˑDrRA~}ix2Oi{ɒ? 
tgsx<*Q?}]Wv;3JFҘ}=Ozwtw읷ٝ~trx_};6 } e>9A[>|fPlP^}E^8Q5Dici $|)zҴƽ\ZQA%C xiG -hUϤ7 ^zC&tyc;XXJւ4|;g]( @q^ [Jg9HnDhsAB MwL] i9AW8Ha٣[c7`5/ilǵ *Fc| /tkZ1D'#@gW D!m,O$Heqr-YMDg%?!ޡ@]ϓsLYKSn")O(@8}. ^I!'.U6#54-@#eg'u%%L}q|.};s~;3To%PC =&x7C)):#Nh`ES?hׅ+ƹp8vI &ASJlP2rIIUEqIIi?jɼ,~yAըE-)YLƆwOC7ߗ:11tHnkKwPT>]YJSi0^rN ,gP npu@VR< i4 )E'(\CE"6H*@0uHv Q{G&*I.zMP+R✘HFQ,/C D6zb QtFPںH& Lk} }:VЊL*^_ J鵌^çI4r{@fB>lM̡,xܕSzmW5'FbN3q jR}ͩK>BhԼj\'+.NŨna@X^D\_]fpA||Wavxz> p?3k^3uHre߽:td~JψO;;eqWkyO'4Qn=:{# F 2,M阷03w7/&%SCWGt/4)BxI5/p z9; +L(i/6lH^d&mf3)h:gJ9p%`dt tAr՘( % 9 OWʌPO.-C=9n9nkzZ,^[f!AT*."51qǒu|^YPڗo83GmNg\G$ڼ?ATA/o䳦/WGyQ2‹Żjŗ5n{l%#xDx2FQl@A+,d3Q-u<B|u)pIv.|}p7SﮠbPZT凊3UQ*) kJ:a5l\ ;9qw.C}zT©mxΫ֭57x_\^\^fB*sFc< nk(TDIL:]cygoJ/pal >'HKcń"<'N(#A54zbfO' 08`J ~Em&FR4FEj-W %lo~̴۟>(bsgqO2WM昜HWMh%x:WTXK.Ʌ:e.ùV {4R)}<}`[*Cv0pʩXbz+J8! $6LsY(+\-&ú*Z7N1eFԑԾ>Or}LR.v8KxCíWUimR5s)&/͒m̺1"^0O4)mK% Bbqz=J޳r{l݊$5d!b@ف%&sXkZ>-/?l6-s<`~Cnܧ0Yw1G r5mv'OŠϞRoǿ|p=N Tż[PcƝbRtxqp-SP@;,8uyФ{&UEɰ7:X\A1|P@{a; Dьm dئЄVWm@KrPtvwuuU}N|衾fA^~厎,H\#8 A(Wxƞdk*yIJ %|>HTda}$nIe@Z|sk "i"@{D{yx ,O̒HNb֯`9\#wEL:F: ۦ;E%KS.q"hC"ZddVJ&N.垭$% h͂ ^qwz |Sber'e*Lr|%ӆTP\M/nW@V5%@W59/iյ[Mz-nwN뀥bz5t]&e\&ev転*X.s'J{~eJY9+-0D! 
!N$vR7yqe.*nP)Bx'UmYWac1L?@9U`ts#%ڟ6ߙm[-wV9n Q7ǹKq#u:P@xSjhNɗO67զfǕ˧ P`sٻ8Sظ f dqNdqN]|aAM$<%3(сKJy'he4E,JUT/ߘy#gu08s08@m .I$cMqzcۨLgQʘd sm$3Y/ڗo,ٌ!MBnL]ȍ ZCM ;(ВRJ\}9Yj.֒j%.VŬJDb)BFJ k$VƘ%(́K4r\0D[0LC\$nZ^+co (ƗX@UHf .׶0s[evГRqpV1!y\ek2}sJ1;c_;Jr=èbxbl% |ouJF <ikE{g=f_gjgqoXɊo?F?M{c,-mc[ǣ=?~|zz~Ɋuh>_O?>YEB;jT[~jAx݊~n˯r7r4r˩ɶɶ6WiAK""?=rX`^++nl]LȦ&;RQ/GafxJsp8տ@lO Hl0 9\C?%zi #A @"R\HBZǚzK9ibJ Mw K-^ !z 7Gx5^Bi-Lf׮bq<"C%M(b}<1Sb^g~U,"h%1ǤxG$A ?YPVlyM 3bLғ,Bsdq}"2Q\dAQrd&˝*Xd}&P!R@"(ߒؘ,xIH SK$sgi,yP ܔ!{A&B P:Q/jXRk1 a]D]s@1CI"(\0U` %~W+҃jaիY%iEhuYU.nђhx-Ď,e/n"}YBEMx4 $䫛$xӱeJ"W*UV4e c."F.7 cײn %lt @Ы ̲jXi9zɚ>>DtPz0I0n-L&`#S3uP"v3l071hҙGi0G#C02r䘄.E&]nmWkn"H _3cy9b|w{n{ށ齲tpt;쏼0ub4b5¹ |{Y)FԬHJS_J]<5Iz3KCU$*Z5vl0˩ohoN{JmV\-6$Ŏ[fg[e XU]BD~|o?wHp2^'e-?8\QNǗU3NY\U}|s 1h&ѐ; ϑ> .vyZб3w}rtk`7Bp3a ƊԊ!aX}@Z pH3j D" ,W@\+"{lSqYT]bxKQVYAn#껈\Щ9@ M9.T 5Tׇu'טd9φOXeJW`1IylQ$^ٛ/YH`|-9@9J,/57`8J23Ke9L%Id{̿T 3V3kތ]$L%A1w~_8~e\"z , :HU{͜D_$. ,"Hgx{5}(JWp\pƍm{3nufC u6;f LtKJ O d7<.Hn6 F;/b$:T`R[.婙Tįfb@7}l:I, 12}u mrB=KӉ=+{,P0`0{s$w5,ȕ\XtP#R|B`=s&T"X?gH&!WÛ$2z8ͧvWeehHFc2X{ø?n | k$oͦ,*%չ^\˾ fo Mw!@oF8ziOZ#R*y+hk >zp ؤb Gx 'G5mQHKt4:|UC},$էօCEAsD0ID.޻܍NIAOI6_|m kD{_#l!u1n| ~V}:.1q #=Vz{`m&- Ee$sNyPƋIn%ԹUH}_ӎ`&88''rX=i׈&j۷B[$"{oU_}[#B^WZvs}E}:! h/hCV5oEIE9ɭ$x }6J>YqSˍ$Ny 'v礊Aa:9*6%Nb!N`H'ӧ@K;=}٣}.(ۨ$&]2L()kFd4m0aIwWnd[~ՃP2piHAP#±v=- )8LB2KZiR4TBiNa1OqToI o3aQ25t;%\s.QL\HDD.׊c.d"*}YD,_Gї3:'cC4%T ?p[o8i#,O[b,F,9A1*!4X ib[+?n^qڰj+&YV>髻FXE t2x ]xcy"67J#B"r$_VF" ҌV Q_ L (ly$#߸1=n餞X ѥsE$ʸ\$vY 0Eim':A24^Y:ZQKow8Ds2wp2Ms L>OfR\Z_&ԃj .C' T2SD8Ugsm8dMͻ6NR1˿~ -;b+:|vs%>oS |, RO0)R=ys_q']T!b cAI= '8DX`'gaDІCqOva I '.a9@ёLa9Qޗm<ݮ,YL1Kk˹~88}3|D1@XB):Ӯ#EH2 Ŀ(8MqYl?}).54bHsPnhJru)b]< 7/S焊da %hy XX]G>s:kemj ,~e7e/k^<0/]._۹ԎiPPA7pۄ{Ou;kc2+(]٥h]fz7_@2KFϟk,u.sޏ5,ϪL8Zj\/uʄtOiwդ77tCfԏދwЛē̟ @~Xp=(^0?_x{x7 /u"^߽~/߁_0B ;&_24f2^ZΞ_u^?/9weOqdd4M}AM^7x"ⵯw@dנ߿W |r!a>˦2 p}2}=t(90Cz\42{dAp9mljLan ԹI~ZX /' B2鸌Q$%ǒ#E? 
scWm 3>a%̊G XQƳ˴Ogñ3t2"І_V ;.SsǦ +jb2R ^85S-rU&N3;ʁ^Rf֐Vcy !.P!w˽~օ 4[.E>q‡l$kmOD5f2U9k'6t`A_w!.ЋH9)$b.`&\6OeB~nu8y Q:qL=2'k/32~{y)=HO{S,J&'=urOmwOm%杊Bw }"Q8X}ީR:X!S<"x`{lR#QOƼ% >v DwsO`lӉLl?E2K]*#؍P|D|eiC" ZbQh҈twt0)ʑJ*t6:sN9F9'wɝS*($QD)pD@'r|ߗ?L?2%cCdž_Vq-q;m>EQ35r1z}]KqͷD&=R*ӞiuZd/xBnEiuZd3$#L"Q̡8R rZ=ol2m(.Q\T 2"#c<>B2@rE%u+蕤\QXb6`6T\a BkAǑ\{*RH>АU*J#V Qq xr_0.$\!о ,AS-G*9#bFq}$HE. j+B!\"? oJa'q%H2_U 9d`[nu +REQxHe8tDVzG)%#bK;iهyYE""\qZ"-GSɡvq4A !zCi+K-Ou1EF#Lf3MbM[M4CtmݱǶ2O:m̀Vp~i.w 5kУbyD1MɒB`HMlѨСMEL52NK!XL(mZ0Άӻ Un8 $EFϕc2 L%I#2A%%@("FKge].iFF1 ؤyn$/r[574Zؓ+ } nM?#qu7J/''wΊM:t&KȠ^DF^:b !̡N IDF 1Ee0nG$S[ nxEJ*;Q¿WIt*LT ):2-:o4,Υb+ԩV`Vb+UBϚ<5PO n%ϐ0GIOp_m@^n;Ro[pIpEktxCㆵ"ڀ쨕rM:AmDkQtjѷNi'KtCL>U(PZ&hmt!8U|Am-WVe%J!0IP" ۆa%9|1m.<>:!P6uBFp _^ u:6 4jՐN)#3xr\Q% 2$^ KjWL^A`vkwfl @elO#)Wh#ivڼE~[NO|Zod@۾Ors#T4Sr߯}CiaWWERKG#.*#`Zџ}_^KȰBDuxvCO)IDIm|{tgd1(&1Bdßc3NVqYiA1Yu#1y`r/W IFԠ.9ES)mXLRiAbx'4Ԓ+%dDC4m \KH;ߡzlP 0TTq+̩=4[>mnj3")PkBH4"RJ2oۚg*aFĆ@r;@_W R|r4+HO?k4eMŃ4ϟ=xkSJ XmM[kNr LOOjN`ݪ gw-6(2bs0Xw~S 囟:y_| T[8 ='p uB|&x n6}b2dज़[HeI:zA;[$<,R#42&w)Ċ7^`Ik뺚;M B) Ƣ90UbO8l)n|ny׈Ch̢xx |yW["ό#}^"!l+T<Ű6 7D+Ķ_!_*5ly;g@= zIUS$\iAFi$h =i׊48E2FjԄk ,O!>~BN'ߊl1E>i2N@s;E2NIӅgAi(%֓{ȴ0=iUD_zAU IzE$'@:2?xô~_G/{xGcqeMQt1A;5Pm}#лxZ4zpV[Ij(5i7BpuZm/Wm|ktQp=<'i6Q!6/w1h-;%N<=7)2o],{T?HCqz~|(i $ZPI 8 ^'4@8jq=;C n!(g  B^Xrr,ހA g\t"a,9pI,yf#9ʼnњQN錡\%N0jmn &`ZsNĀM!(hL H H.M"Q¦Z ڔss Jn6lY%m݀qof 0pEv\P˝:GMbZwy^ӭ9|׆0"`Qb'cb;e!:*,Ii*Sd*hABt3!h|QeCIE]h [=O ]~jD"2̮sƜpMmsɩ?hM.kEW(~d$ɥDQe;$EP0& ՜0;c15ir)7{@Հ(rc-.nǒ$,>Zq-I$:DeiU<`z6]+ =us^G<ޘaofi19F7yX|MuNhξцl ^]WU!}CmQ?>y+ԢuSuzgg6g=h/fST*9"#rmS7߻E E/(+'64eJF^!z)pZϕE/qcY&FC;0h"5fx>C&h)Z"6aV0Jfw-ynDz|~*#m1(@"ͪG unQVB]OQM?!TֹCH uyaFԋhC</#uy:(H/[PCU>d2T0x2\e5XшDMTPDƌl#Q8*9y-k ;xE[8m$^q<q,^WtA!_9NSD1H)v~|X%T^^)BI(U4|Z7âXj]lQ wdDHD؈kGƍ|ͱyScV%d3TGLbb/:u[V>| -sHuEK1z ZF=.=艱&hց!n Ua, L5 (8 ;]3(#4ְy&Los$Ms:0#}gszKЃĐW9E9o&1D*1AIs~d\B|=Ƌ qdD\$J" R&"!|U`7Kaw[6bf9 z)Y]O~8~|fY;uxՎ^*d ӰФ J8C Aww湄_޹8x=kpˉf=1\=$+^,Çe|!Ohxg{e^jT/8ݺS$Eߛ|\=Mo]|tSCM4# sQ1lNAEДeYҔ 
rK9H\¸J1/N3F9YJE`WA]BJRFVYjuLHCVNSAdD[-KMp5V0&\A_}mI;6}YZeLT9]FLi_Wa2ެHJ$ 32-i+ZZ =jb4/aWZALIJ%AuOTy!Ѕ^-ށiEl^ Hس)#o;7|'0řP(!z6`M*(4- '0E42wOΓF w(\3> ۜ[N/=%uZB3|Pmا,EEA+`CNoHhv3J { < Qx`XM28!j%PF:4(41_}0K%fY[Gp<璊=t^ubL%e}HP\|n\lpW@? d7M*ܰY7T 3Ā捻/|Όo~z'UP/DKkTl{~}6y 6(z\גww#a0LZݡPLқ%s㓏oʟ~'y"E!FGٓ\(,Ão9Ng;=Η#f͞xY55'B|=ŖHsȖm{wkJJ-msG8>f6 @]=$];*Rt޻\?oڐn6>\o﯋Qr"wF5$#,YG18Hb!Vᄼv1SYƞ&j@>k~=v.-_PP4<=8{<>;C<0!wep= M""QI.Re,LJTcnj23`&5`DjGӼQr w=z[Q&3Gm_t=Z6m-=PòURNlLL ]9Wm4.Bfu+0"fRҡGzV%QeoRvRMcnؤ1GTBJYVeQ:I I^RadIIVݝb)XʕV`g4g.lMc@BuHg>K䃺mo)Q_]Zw&EHzF3w͘e7lo6@{&1C+E!(E#a2JOѨV>74@OՋ͎LNlO> hk  s;*Riw`4R4ւ,Y@B[5?6,& Ϣ;$Ѱy>ڤ<#j~Ard \R1Lkod(=`/=ּ뜖47˿";GҀ?XdL&3_ܬfUu0.>1| /FBairTvx)nS`OoGFdۀ 6 7fe7^Ս>%^wl 8_ͬ"R7\J߈tHR ˥9sYh:ŷ%!brD,$8RAH$A6\1 s/aF7z Ƭ5܆rzXtN7YRE!<c\U}%=R%[ҵ}Jӌ#~޾ 9ARⓔAO;@8-z8 OA)ZEy b!2V7޳Py!C!,Y dν6|GmO93IzQYOjRA քy4ClvnV*b\4|6kz90ӔJ`Wpf@0(C8zU3xex6 |uD*ON`QֈQFA FA!xNf0y6$H!2FHKoLƒJѓ@IYQ8R( ԠLQSicJ@\01 @ƹHiڝ_H vY\D TE])#IQ8C`Hý[,xuQʠUX4 cfLL4yp]#77xAb`, tH(xn`4#IY"p=qQxHSDM,C)u6_":ܟ܈4sBTe- k/@=:5Ӧ43 }-b=:@V%9ꇆ+tHGaDM Q -~zDwrb8cto#RٔEN{l_ th ^ea>B%97n䃙`hujwpfze\X9ojq=yII},04-$6ͣuqsBlWnO|wY 5koV݄C35̼HU-k3ʹ?='Ow{ĨULEO܏,ZKJ:>|aփ wr QST)PAF!E%;n<2x7AٵRvD74Q\h Ff˓j ,Y`ĈNKQ[j"P,O9)dl<$aV2Z9(Dş4VNqvjdVbN1ﭞOc^<=ZyMcEQ$II2.O᭥)DV"Gk%)Ub͓x#&p5Q8<wڄ@`_A|Md'3Kvh3D@ȶ3{Z9S(jdBC#92Rh3CR0F J!4В V&:3S JT,w氒԰T *i|&h.eWT"0$nV9oV)<*%jv T z>7PҙhqVȫ]+xVκ%۾70CCq>~:Y^~۶,",:疔8j_.ni?]poڏЂ=i_L?ImVŪ,x!"l9|,ZSX5Ƹ6 jqP:kn3,nՀJ!8 g*L2=Va]0JZN}@j2V htCpݟ|WJj+r3.&9݃^,~ݟO>C®$Lz5*B@W#IR!GJAxx=SdsSgŹ;[.}QPčg˵O>|{&8Yp[픒AEhi#8Ja4­ʱ)NXȳv?KvDS (\J Ⱦ(.j<Ȑ%u#ruJjJsu};)֔|58w֋_*6sѧdf2e%-%l^r6I"ur oD.he  \'DG0դCV{ Vcak:T/HF8V]Pը\ q>~NTɁEu_=7ÉA&ۿY1XbfɼUZ=LdJ)j8N_ME e [ C$BtN$ E)hMpU>*~ioYg-(ӘUsA58Dދ 1#1jYT i*{Y/)ؚ-kW>STzSk*1yCQ1#D-BJ#D'ڽ2cWt{=9 K ť'm UjPNQOखAZLGÝv YTyaQI< ڊ\\Q0ꅯ083;^vrͮM'S54)sz!AI{2caJd?b5ӕOz;12A]ԞK($,jmq$q{.04\%k9;8A5kNP.WY1F0V!F)vu3c_f4! *,)'p}⹘Q ? 
&m0V nhC0gc;#^p#Muj vZH؞6SBfnR{U"\:ح8hukQC]K-JjSr@993dzEeY0Y($͖5%9_sB<|q/^Ќ˦7u;=jN˗3R |C#8.4.DX^+ c{Kx;j/zx_| $c#ҽ@N''!TRSRSSOON*5E)OxOVpJ0΄z~>!Ӡ:ĻbRG *{F Hťr+[8fj -cu+QȈ>dvrbwZO -1YEl#v9'N9gIκQ@¸nQIrS9FăO({GʢKӬc XqPkUJ'BB-ر>> ڡ|AD>zg/t: 6! nӓSq6w f(cMsvNL[|?&![:'߼|9|etÇ>U V!aD?*_\oE-yIߑT]|hrɋ~1ޞ؛7ҩANJkJ|%vЅvgEɁ@F$^]& =.,šNj3*t, l 0G#kNH;t=-!I'}ktCdPmF0I\ z#63{?,lP\҉KNdj?wTjh.x=Q \'P 1d-E2ۡ*g׼mnuoh [YFAڹo k5=S G 'gaIFE;韞7*4FFEE+m4^`^ʦlxaPs7Qfŗj'{d6{|:q9""!W*jIŗD= =42|94 ZЂ !@vF7FgzLhTf` HjMIH:>-ɵj~?]oNz!#]-6(z`}UGWK50^\qiti|~SdPu.9h<:/ϳl?WQtwV(ɷlͲh?=.r2ErΖkV/yg0e} 74%>T:2@R,QN9a#7A@rJqh0zo~8ڣo 2J~STI9@f"%&q=|QXkE.6[x1Jap*;(3\!p=emrj]W})l äC4_=wDPpPIs~!pXxH:"2nǔќFû+@JQ!Hi&?x9@ds;/mU-2$W}j|$-w|e{\~x;*AWޕ$Be;[Rއ=4l`1G靗iyJܦE5IA,RT82mC2bUF|qL9FN QV:ۉ o2OY{CY3e窰h;+>"U➨( |b2ȼL"z7] '#MX(N .GЊI#)$ }XyIBl;X cu$yyz$Rt y4y<KvX$y1g6{uD!N%vl#XOi:g##lI;;;PSf93Mw]2OaMT%rŪwZuNPG{MqBh0<_%Q?\k *1ߖ:u=g$"ھYQfwZ XfH/fL`4{^ΘүrZ*)q ׯ67ʕUiNx.p`M,:#%\MܠMc%Zmxz<>S\AGFqCcm  @XYV+k#k }Q iWy=kPw=$dW4xf 1Ttc탴y1B4XG+ ZA;ߵ+!WorDCe9+A,G-ҔjY4R+qR0$͜m.$WvFb˘SzI#Ѫȗ/bݠTͲn*o{3tmt>7ᵙ>Mlp|Wy_|?/Ok9my$:Nݼ|+ɘrw=[rۣt~/UGy=;()SpLMkn)=~]-*P.MV& ELavzGj[[Ngny_Szn Mn9$+V2%{; Uk黵AalZۀa[[B} LDP r7H;x-< H#KhPlp #E)$a CU!!H 8)&yAK d57Jj;4T9lЇ!`,$ L(,od#*Қ j@=և$el$uHZ3,(z?uLLAHx8Dl` Lv%60zlHR@q"k2IIbg0 =F֓H4KH:hVtG6Hս"]k *‚-#]CHZQ^MSƚ9 epSϳSA€5/2HfU+J DIFBa9!/E:d^XfB$ɮYkB+5uKmg|Ԍbޜ1\'͘\%Vhⴠz-VP:ENƾ1"ΐ^2l=NpmdD& &KpC&30EHkc50n0 Ɣk,JBn!b`2 v5G/`g ikg )w8tƐN<4ܖ+O0s(sU9ªC}|tU`QBqs0`~uO37>-7t`+%u1ղNZidNhP${VqKYCea@Ax^ߧ~Vn6C4 A긒 H1 +%`[Wa+v@å͞bR_wE͋7;گV1 ߣǓIt6\xj]{Z86NP>@$XϳZ|8xZZ!&zm,O8yly^o;PXSW5G8@&ne1TJ옃P- =@b3 Ă "-:>phli|]|LiI?2Ꞇ5|S@)W-*&+^Ӹrj]s^a `aR~?!F ܭ\ejx^X83Ѱ*G BN0$8@:݁ &(U`@`m9Z+ kfCsH+M,Q͹~rjQv+%?s:LMUi9Kj=rUpq!cI=*cmY\ƘҘɔTwIf 5H]aYҺFo2цX!6}4 :E"O):|)_05L2f(Xw~UlW(Bt-Pg,%)2:SNI)]|jh؜%!]m$3NSuzKmp0}U zڞ*=C1;#x-}͆A5! \+zà焫gJ~\tE85C$Щ):  #qɋyɋ˫{p02%/Ctoɚn@Dqxk-$M N9սd 2z7(kMr ઈ }QmY[#{'X }_uLIS| @fPa L pz)K*۝=*,2A<ɜJ. 
Ȱp"4)m,kBD0 3JH,=`i9u(LJ,[an'͒[E}CqIMEDfAݨ4jQ,4@Fi~ũʧŽ<~4X aZt{[7 -x+{ -Z3ZnZuX,I%Ũ;vp){ _9F:'2a1Du`a398䔤"',9 jW*(_sGօb yTHAXN@r'()o0*h ^߱T J5Vbp/1#.$1&**hnK6&?_oc}WVĪb|7[]=OUpnSJ?|T7??\0/o.9~qpc V\T{?"Xaǧ$X),#L|aNgF"%!\n v6A#[:IlI٢H5Iю %}UhP|l0μ0@1|FPms:9qDHb\4ڙ3epw0>ZiDlpgB:ӻC1 `3}s/θ[h}_jQ3Y g@a<]PQBa"v!S!s6k0S#wG8hqO2縷ľ,D1cpˌcңP14@EnO{vڲs&mRH< Is(_,QG@C$9@P)|^w?@/ԤA 2i>%E=.x#J4Ag%jVw$>휏>Pr)=ֆht&rhR )oѽ,M#h$/T0_`*=~Y>:=[}W80X4t~>l<יsEY 3nGon<=Qx4keH5C'凒GP>ɢ:A[ 59?%!%r摏0ȱH2C@2}8:Tbnj2b]sSG-\%zo8ppn'YX 3_h>ʳe̝4 , !VA)mhKx J%8Cj]"CpN}x)~%t;G%/)`,põ/ 4^_&hMWB`%/Q5='\V]J /4"m !A^_B(/zUtD f\`9Tnx(3XwOǒN#h6Eh Лt`R!o&z[} 8JpktƒX-PcO=a4eiU&P)RM i0:!*!L|C)Q Hm=1L!dՔXh0Kn >6bf!A 6 -7W^tNj܂ߖEɚڡ@ۯdu*㼴PT2,++l*U[VY_ (,AW~Xu<ܛ|?gtX0F-˹qqJyた`F`G:s-4 Նq]oƚW 祀3t1E-$p$Is cYWvk #jNLki_d[w;@$s\k0.DʦZheTD^SוR?lW1&1J[2@ R:FLX%@m$!T+,{ٮS bsO>y\*4`07nnKNKL"P,u|fehYSn:׌)F.q.esrFщ7y'齖'o;ךRZNN5Hʖ,Yg72n,9jE|&ƑL(p== IYB=} Z8AI~mv5+`V&\Rb/E> ^>,fwS_D t#h6e0W++mlbq|eQu\&Niۣq\mEX-naˉܖ} buS`-*Fv8 8ե.m[dAZ&4 W$b1ɞuT>XT NjuLά[U[hNq{vo4Q}nTQ6XGENKFmݒjݚА/\EO)NNp^)}nTQ6X3[(m,hUք|*S)}}o݄Ľ141&Qw;-Twgݒ5!_St5*g5*RQLs71?o} zQ#7[=fy6w#!&0rT&z} $*W.rOf6DB/+RUb1"4yvDQZZ 3sDLYi]e1Yuޘߢ]60*ZisЄәD)Ҩ__m62O6H౾F0A8!ÛN|-? YX[>_}{䗌x;r[X,G|z{GWYypM?_}nzw7Kt\a'77(ݙ$|WR`wKPŖ? b3*#R"%cdNf`aZIyC[</[ʹߞ/2Э>@WQUfv5^ޫa1@*71AD nL¡ )a1M"-occ|]ō^gTbcP?n N*|3y2j]su挒_3*FӁ}p7[5haL 'H'4['~"@Dh=˂ mը' g@cKb(Q.vN{0Ql348!.& X'7xZpuw'RPJչxx`RZw@oi0`kἫf ?ͷ~VmsI.au2zfw-ϣQڦ{A (Tv\zWHvG #RAwE)>d0[~CwmXOZj 㢔ǹ#fܵbqy5e2(~E[T# f4h449QT k4!4eE4BK5x$R˲eY,|YղhHSeN bqZ]97YFܦMLQB)ݍ!N_hfr`ܭ>H\mgkΕRZ6na)Qn(ȆE%2J9\̉4la>NG C^tZT `=8¤oKhR׻YL- t%\ .:lQ4n&Us_sVSL?43!|>߫O?\77MoB3eg4oFG4?JuZ;mX P;_ٸ}5n+|181A,L CE!Ycs]njYVIk,ce (C\X.z zdcy>C.q].[tv[3GYDsQ]8.}ibĖ)Ti{xKMJ'詥FץlZ<(ɇxTfT {=|W檏ޝ?xw:nlmztѡf <_X<] AeofաTu\/{E_:#{N:D~@h?ܳΕHz?7/qk'כ4uʹcQ|ew5b)x)OFupQ|X3] 'sΕF-E4С҄ et5CiENX&N)(J䦓u*ݢ=Ps@(Elban鰂q41eMs?A旌+Ƹ;Ŭ܇uOEX r(\tN\xsh GŸ0QZMpY-SBD.`-&wB ϙ. 
[f@!$u{!g gZ=J.>ø=x]&`}zXL>[**֧,߭3ɍلtiωm-8Eek^m-Zԭ~噶@mh3B[EK6Uf)=Gehi~u,kkrVE5O9~Ij8/)gͺ؞F ApY\) E_7Fw^ܚ]zWd[CuGM %5@ P.(2as;= -0 ߀g~w`DIk2R3)J$4VIj `H&H(%ȥ0P a뵽fB|QL5ى}CF9/P 66_lD?o~IJaR9dȚzSb¬XZ2 J`)NJ29c?8ߐ'@RJ?ӀB;ƫx/~Hw~G oţPsIb7P$C8x#V)$;_ή|GnK3`^jZ,mFإoU7 J2 hG*Jo0#0% |֌H@G7#J~rˏ-Ve}U8FtZʕ~4;ʷ;yQFiNl\?l^ȵ7ʰĔfzwb T\)p!KPԶRcߩY}-ºt9sU'&%}4DP3Ii2i.Qkrg޴k{>AsbgBev[ܳ[oHl߶{uۣT:1H/Q9Ϭ;UNPk H,N|2\ e, YD yEMc;zMH`i!_><2lŠ$$B EDD;NG}-jRF!VMe|\z'Q1Rre:QcG0MV wӂރ?i<|\_M?-v`K uv/pb/j>(`ّ=-yWכVVCX@[ARpozѶݮbBȒz+&YىϏ*-;wd)pMW.̰Nſ0o!C!u άTLT~"Br h4tIA=nx;3k QeRe%B#BxvcgAޚUH{1f(1:mGG Iϱ2(Wqԣ聪{mVģS!ٳDCONCW Dd^=s+oqו~M4-HY0%  VؿD.rL P (ϑ) 2'5c__3孥Q&-;ƹ 7mqnÍ6ʕ Pid`ƮPF%iVX0khpN#EQL5DtALtja"g!Yƕ]Oeʵ,/},H]25JR"`NNV@yԹD[83Pb D>` 1c!p )4:-GDv1kCB)&K-fXI5jΨbUJXYJ( R$[lj-\ #B%"0;C)ڽ`ix}Pƕ&bA`PyEcX 1+ c"1TE 8'>!՟BZB )xq%42&sN Bc..L $E.9r- V[EkMD+$yD$} L5u: C3$DT:ع 3A5Lyxv79`DX/TqAĆ.j7A6MryhZփ5;) 2NzƚXDi D]C,=.oWy"hحQ3d5 Q?( !zN׋Qr8 ^GK>-u2si?j­@XOJAk$Tam!$|4K&0{Np@J/OLxu= ш T9 Æh"$ڡfv8귏PuV"or[>*6\K+N{uv~UV)xUǫ dWUjy&*},d@1,L޼}˞uNYM0G ǫ.gd0h w`.C B/eƁ )0V'c8P`J>M%w{r)H ikj>V̸tz_b= іUp/607hjoUTaj@kZ,I S*+Ls֞D0վD#i?+!@4;K[>DBz.LqV3U.?eLo188Ջy%EU Ffȴ-B%+&38F?h;=U >_ʺ)Ӕ9ՇLZ#($T_gYz2;AdtN B@zȍh\~kOr4#dHv( 0?Q?0WX6KOϷ:{h@I8 -p$$DXDi)Iʪv\UI,I֊2gxtVMJniIi:'nVpҐw>y-hb,E7t.lyb56D\#%^:U.iA&艶1Ox^3-?(]DZ'5W}?V~GeܔUҝb Yo,7A났pHJ S_w{_@֜L*w7w'ppAP f%s3va 'LXn'ԦsPRREk:a7)(']'T4pGخ=9Us~NB=PQ0*L̜H=)z]qs<,s״C$I%8t~f˕tܗ;28‘''mm8EYRGHQC׏RwC JU{ bƱLK K(pL LE~%C9m5pLPA$$f_ Ƨ=TM}T3B 욋>o R9O==2DZ+)6>3C )N /d{\.y(фY1)TL8~!ob~[5pb^rٗ C>YO_"qx;\.cQтsmMYJ8~(Y {o R5?>~oOC>PJϸṬIɗB{ـ G")\Hleۀt5S;/ O=]nGb}q|@b0UU4B=Di5)nb{rNrO lH y BKy烈 e}*H(,"H"[R!y{ .M^o{7_oT!,^ـ[5]o~Y-cE&:]^·=(֛۝dS5 XÔZdw[+:nYVy0՗&-dʎζtcrm_~=[~ ojڵ l˶S;UظC9&dH%EҭbWSK=s;F?ZUJ$Fƥ gN3_jiea[D'}za|RQ$_WC"%Do.F,BQ1N$,X<ћ(辎K)31T ^MffqdRkXlC&xO̓?Z@ {1ido &)t8_]Ngf6 {b&0$qGas$qmPy~6;R&Q|l4-ZWL/E$);RBZzٲ!&}+{>'NF^)3g0ˑS(sTRk蝫rpeF. 
fn&f ;#޸߭ب 㛞!v#:s͘Q$# L&3Α=ێf7^v%:Q {xGSR3>jˣx-y+ܤu6XiϾnh5fsel|̦Q9P'QFLtJ<l/xo[I:)UT9kM`I ۷#nqe+Kl4fYʇ{?QX9'f3MT~\OȔ.iGǫ?>LGƩmb\⻕,S\k^pS*1fJ^]hz`=w$4IMĞo+M";}5~)QкEc$&vW4m\@"ʽ^nnt* ;/;Wڎ)~Y1s>ğkMX Hۡ$t]@`NJ4 D |P g/9% ຓcuU}!]B%!1-4<#.& @X+@k$T8?a%B-$śͺ=rȕKA*=L,gjjg׮N^RErʶ<nE@ԌDBs>-͋]XC⇙7n/ͦ8 )iyh#c};Eաhp*T6V+9tt#S T B=in#{9۽sBA q,8BNx(Dzn'w.nC'o}$sAe  F" RjJ L>&׭L}uۅt6,`G1+[vdNlB[mwͷ9Äzn{t'X %A*̏r(pdb"[+rZӽ;lVw(IW5\bdvX$ y!B~&*Մш] \~sB;.[`A3b_#&p xzWwUaOn^ulz_,GIp3 3CufoY!l_* i++%WBjH@G+$M" `:P\}__]x,VᅝAZ_9jp1CǙ6Oos曫`o8\P(d!0 2 Z) RE*"t Ex?Z&mML::td@ tCYA@3i #B@PLHAXȠՠIR]U/˺$L݊fSP~^,*Xz&[/gI*9>@,4@e &a36%4ɊzVDz1!)ābjBC%a4aFP82ch 93|/F wntg>! z}Qc6\#uEjqYZf-`7G FnW}jcc?_Iچ RAzh'ivyP4{ sDK{)s>\"WjO*10h DntKߔ1ES/m]z9xz@!́" Aw750ykA_ΣIZ%&*C_+##u̲@1Ӝ#!c4 1xI0A$#ET= @H B@jo4җ ѯޚ?ZQU~[4Tuֈ{&Cc:Kj]ַ|Qچ0EgL-o0ߤ6#>"QĮm1)t#m1yq};M4āRσ2yA&[;<|3[$ݻ8U0џƱ@j Ү\eM˭wf Uo|RZG^}t+Z{JF՞OuK}pQH.Xs .G o-0@oO`[f2.- 2xPfNǞ",]$5 j3E!WG"F))$/ybGGU5JHM! Q7ŮP]N*Zx17f9?۰IWBv7 30SH!*'"*Ϧ3>* {_4Oe9UdRYd'hREǓKF#x% ?`~5:n6 *l&lV9êٺxLNzzM&pc?$3TD 9ªNE.u%y#g,Ӊg+sevyo,DO")wY&qc^.sFU;=DhM$vdv]A1x qlgI,AWvm;NLj!{MT ts"9stnOk}0Lin_t.rNOڠ=+ǞoX !xkΐ`DbwRl2k0'X%uHH]5j:{::-IS v;jBˆ#L5C<OA\b {}uNEl>aң\t(Wd HbmdNқ8`@7y2[ 3<nnLGr!ڝHQj5CvOUՇb %9]nw DHHWQ♻2-6 SN)$-e![ᷫyVXDY9yP{yep4D E=$-tTMq}U?FX!8+;0g9+Џ[)b_~*{П2?5Є [(wH#)IihM4dtϑSc&N Zxs$gIss|29S_6Z`I={{A@bs[7êPިNk5=BvjFN7!R>2lƳ3)tnUlsLȲ#*><<z- >.[v孻e]ca_Mu}d9]<#Id-Mj&4{anf,窬!k>xWDFv(2ů .G&T.aLo$+GF@"ܶY&mHeMHM8x>a[=s#MvN̉PSQ_Syfib/WWjk Fy>k&818B?%;JdM )5{^}e$kŘcВNja-6>"O{۝WP+vhv(L9ڜk cf?ECTan&Gۍ/tw= pQUBc_VEQ-3F&QVdd׶@"zF'}L 2OH5z(lz !>)4GTr ~jT;>00t?dwąrh6oS|'ߏlÉ6(pLO& >m+0 ~rwB&t31>v?$m,3C/ߞЧ߆\I`OLڟS5ùq8ٹQaHp#"2_zďo֥y,`)Ok"ςp?AHľЀX+y(pU!/d Y8B>sgJIZ)9gbN g+z - _gwNC\>KTR(N[S}YŲȓ* uŵ^,?Ks^s->Y71]8QN;V헿Xcr0Tl d̸?.ȦEp{mW [s9pe9°}{D^|K:`MC!/d:8X\.jZ!]NRcW^5)Z͕my1 XHbR(3I ckB1hAP _Ag;b* J *l`~5^@-MSi.] 
PFC#v:\0$vdѲ&z1lC|f.NO{Cnz~꥗fB_?Z=0G-o m{m]@__ y>moKy0cr`~EOoAX7%0j,;-ك˿aPJ@psEȊw2F!h[j|F8\P(d7X@2 Z) RE}Q*#P#Rvetn>#Qnkng^{]2ۓ>21Zcj}{,|[|=fdݭeÛy=Fűu B?cԘ!qUU֤zgƯwiF`#(/q|QLҐ0cO0|)%"}yw'xר%bD^G"i&)"ˆF4P"R1"KHOC(Yg:gWK診5h"-Q`f[6*S7 79^ғ+~O Elj\{Ӈaqiͅ1ohx![va[W~"N'Z-?lxHy ƋixنR5y3V3o\V2+Ik69PQ_/ؗ>!p%W%x;p3>XS."{nhtdσQ*fE2,?D)hZ) sjbq<K`_`4(F_#zp+؟]XQ=uf6Q[+$PpxN7o7~4^c@KHH/06$F\{XEQ\e4ma kƙGG qrq1waiH8#'?l`Msy1o95d(4Nl 8đ"wmژ-޳q$W}9,6 [{.w1x愞~LZ;_5ICz{fhp^]]U]JWh# BJ"i yI#y%dI97I؎1dr+Tb>Vk$-.'(5 Ё${lr>#HD DJR㍹_Mf Ⴔ*H1;Z Լnva~AtUXp4sXЀas5ֆw P%zv6Y+>bKH^zLj?-gZ3,2{#2f1H/kx{0@=9Ihznq+-||}xz^HKgy,sZOCo o/b c$<&>5^&rܳf~cyXobӢ4[eM%7=S o_&򀬋jx|>> Q;sO;,"]֝<<ưp /P2kΒ(d]w,d/L*rD"wЄ) 8&\?:KûLb<>+ưog.c`-b9[7 *y',(Wʼn86l„SW]Pt]W,DIiDm缈eQy{E笥2>a~s&2pc X9}5WWޭwlƿVJ|̑N-:(%Gmq<X[RP 儀= Ui,}#d-ve4 Dѻގ(qW >ޱNɖ2 t ^-gߜso9Bg1Zh/ԲBNLBHg7/[71Ǻb9\/1%~KOSSŤ ǚK)+qtyxiM6j9)KxL?&ֱiSCh62a?NljYcNr)D6)'u<{u: qL+3 zyu::`{OWNi,/B ^B5鹿#:="JҞ*Q`1Q&4gQàUGH@f@VE+:& H fYٗ%dhEd72='F#Y*:n 4%솆Yb Ub7PTɭ I{V,WUiGw[L ךlFNl%. 3IXYƲـ\g$F9PSFnI푊労ro,]B3F4F3oi$2> bWQа '~%`ܑpxi{C M?hH !wsg*):*0 F9pյ`DrX<R"^D|JPke5I"'k NR%Wf󐮛,cRI2Gi4 |#.mα'VbӂƫnIYsb?!-{ B yK*,FqΥcf kW0x + GT9DfZuީLT|wFNō8JqǑ+EbVMz8K0h@r@4sC Vk!p"L=aA1q@Z.ZBS"xM,BA> jAOuS:Gxjx-^i[o;)}v2n|g6 b'{tib56Ɨ;Lf.<l/l~oۋg,K VW~^cOd! E4GG[h7k[. RD'w&m89V^ҵvF4U\Ddpîv([. 
RD'w&m@/xdo-TVBBr͑)Q+DhݲC ~fGsˌ;xovr7)Uk<^Wv]xu%|W73{Ĩ57 '=KR{R={IgVJJR{H{B6Fx1 Yq4hI `Q ){;?S}`c&:$}}PQk+F{-[4V܇[{.h6!q&NuN:,N#[yzDFQRg M*v <|(Yh>]ҩlXT"\yXY"=4׊{Ie*\⮹U$k kp|kf*lf-mźs t˼ZC RL#O)F9GFPFRJe4*040v.jJi\|wk@DY,˧{?1Gd.|Ք㖙.mZCM G4g:Jy!:0鱱##Xq@;9pUUa[f+01NH8!RcO,AyŹsV5"7A ҃nEb}3ZWD Y 8s qTTmB`c#BvtF4H"f6C59E0+kC˹ZSH$N Lf=J!ݡ{kt9_^0R2F#0 ;x|9HVZ/_m~?[ Hwϓ #1 rWl&|e{^oxgз?\߹[QLJ[x>BGj_PdHkRF~!+yaxe{Q`u-b#Na( [cP35A)"pWFB&hh5pi橭ޭ ܺy+ !m̿BK#i DZ#3HU@a$Bvx"-wE:e!8d:14P)+.X8%-]00'cKH ^hE$zpt{ʢ3o9m5!vr*H{*R,*^T9*qÎ+28l2 v"7plӛkob2w'Q-ӹ7g7דjq$xZ;yF>yOyC %B0iZC> %WXwm@8x{67[}֋XE?\ޘ[s\C0㏹N$&ٍ\oM ,t|:ҍ޼{wx6\,.c9VT9ie}ODq^k'pO29;lQ3IrY.SzFa%BXf=g&xgi0jA۱gvYRV8*\ӟ**4D3OzGb-NiKIɝ"I#^K4yR򈗂*UK@u`S.szGp!D(lR}a >1^y Y35 @A烈\ʆ) +sUC)Ja3WϘ۰`#d9l[^/nbZk02]%$%$:ẛ+AM*qa .'h'e*wmI_!EC8p6A|~IN %*$e;~CJ>FP%-zWoakhxZLs9IܦER>2xQhbSv}''[')3qs btWCP[G N/2W;I`(RYHy"ab ig+}r.(({'g&#FU~ ,2ռ)ٮOXm%u|[!8ek, jHMQ*19f{W d%*0qȿ6zLފcmӊ߼r п|_ex41"qw鏔Y{%kiixSS (;H|y"E/hRɃ6 e8UI$VHV3oy``MN?jޗ8C_R۲M˲ cY>w!/_򘙢!҂T#WnI.[|p0L-9^!q00ddq[Q0@Z™˜0kM 'ćstU6j{t[)[9BQNl"ו."9F5#V&3eHqx- `anZ̟sLT6HwB E]a=VE: ҃XyhU 8 A${,1x~ĩws:$8Ew/~Jdm%D>WאN owж?xt/[,cz2T`9f쨝BFAXZ0e$eAnrZ{id!J?'uj񕐋VS'/PWQϙDRK^T0xaMiѱ9^RHcqap:wD}BDzMh BsGt@2@qEWp-jV{bDP(ظ+-""Vp(9yktO4sWkε .PbZ@Є00&(8X'Ht(LT%ktnkCSF vt\Ej;m|gGnz5,TZӪns8 ߑRp'r%j"^{w eFmRb9DqDeѥUhERII_$%}QU, $4J[I*rY*)QޥV_q =*NqQggѵ}Q,/ߛ9Hpz+Ϯ~ +oL qPM"ZAܻf6rb 6I6$ǝcVQJ ֘"-&J"UઆeXNTwuPռk!!*PoϘD )遜ۧ)mGr^dAľn>/Wq%akI:@DBr!@2D Ȳ8$qD^E;G11 ɅhR2BZ3ԗ6J߮=sK~IZWp=ѵ+t:C,ITR1!Om_k28D HaiPfbZĢ,Jck'yߐ 0L . 
nà 2]jD\g] H0!o#Aw|s٨$h LhipH[CqQ xAs ){Zݲ!s]J%H`,1؍mB)֕B4P5]L!*8ú uQGXWSaF yvT Nk) (k 5Ars8I` *n zc#gr T%_hV_]A禄PϢ.YdRQQP HPá<,x@(((0 FJYd  Q [jf fѠ~2X3kV#ٍ `Yk4[IJrcҳ)fMba nM470&D) 0!뺕 "ʐkh` LJ`Kh00Ͷ&B)`I8fքvې׍ wrH8u| 1bvo-`j":bB?Fl !2pfv5CvgdƛĆoN'}{R^fDn5XIt@xf3ǔ^iX@f0Ct$*'P) ԓ??y7n~#!/Gz~ $v`onݯ9brcSn>H089pp%zg ^_{ 42[nfw5(-|:ȃ(+ȌcTLUd 3.rz?wdF5ZF\dt?y8rB _pAUj|$cAw;N/|_N>7g!Z.v/'g?2 V8JJn%n*٪Nm@57[!{4ƒxƨf,RZIR ci;L>Iӆ%t bIDd}^T)caQygw{6^] ݅ɷ˪->E*P 1РD@e@ĥ{f<T "` )/0tk?6lSK}"#}!EpVV(JtBh鵌ҩӇ僘{*/Ѷ[8] zYdX NC#H`h8@#ǢcfV{gCJv^(+MZÕUzWGK >mME;0sƨb=͠ aL2:FzuXvOog"|NnS,vk5|<+nnbVڋ%nj;=@Fet??LfԆq4|t,Mo fSQ65 Yiԑîuo>x;/-+sՁ\V56` е>1 _p,=lwi58^\YqL6Q+7ɧMzPk"*vxrYf5oaђny{[TM:4o}9s܎=eMKk RrZ_.m -?hӂ~fGaf#Djm]$%`z~~`̵«kڞ^.!2+\u!/E)A%ʮ% ]uK$dM]i}L<Ѕy/pdZs]e3Kp<6ެ"7`L7ʿ' ]7LHw'tweRV9F2NJHWi-`gN.CBY~z:f+0`\LG[@rG A'Thܧ*{sSiApfzcҥf:̻дw3]a4tom&Υw}m0?dWuާ8h#n #,JShbcQ]hV[enkkEթԁzh/oi=)_im4eҒdEa;zP^fRRy^fa67ْ?/^$HyQj73r v3*U<1T  &"y9Tch}Qnfm_7C3Ly%~4&؟˧>Ds;7\9/I2*Aޒ 9,'?ř3-XZnůŏd%[1ZbZh^Fͻ&E_eoC>P#1jaSԠy*hg2+/<}2!g\8Q)Hʩ}]!}fo8+A\&њ'x%~((h 5&`u;<r8⋑A~z\U׾Z1{Nq>dlF3FĊ34sv+N3YysqU~jʹ*UR\5~ţr 4xmaJ7?.;&!ccĆ3MХyg^w%{oxLW|¯骇MW yaZW|wY`ޫmt!V2NAF-$b+P.z)2V)d%EdX & \1asM Eki[ǩu< \u8Y4Zp1E u'Ż0z턱.à OMX.xp[Ӿ]I9]4:pDܠqAh͞>X:ˮh.)UGHcG0ݐ/h^ʪd U&k0T%<8g}[&y"dHzM?r900l+Ywg0~!2Ub0 &o`'i'.tR&xgTNX,՘kU)p`LyRrJcj /gkBkWB m7ˆ+CA+-@{A9kR4$]W(}G.ag`P;J%`rsJE`9LŰiUGaڃ;0˱j{LPhI441?"3]Vְ] DkH v&uۍ! 
=^ VG^G->Hk#Vb I' v%R8/b]Ggpٌ A ^߷Vnjr͏˲o:3aRг;C6Њua9Z8;p_v!kwE}t8ODC=*ӄ2B<(5'&1@fuNY@'&BZRIk[ <3a|+Il~[h]RGx?x}QĽF\vڶǙ{q0 -G1;sGѭB.3!5jx`Bfb]%g.&XK&EP1x2͞> mE)ي>.u«z6AMl I(llr1e*%'mv-U+iK ,Y皑܌Auc%Ɩ,ܗNcslJocA6ϫKzi_Wg١dUAxg::DU:sz4bOv1zΦKj?=aU1 ,UM9Z)/p΋s;/8skCH3ǐ ŐM]2k9p# Jh!W90jr#0& ܶMoy1y1wmF܈IJ}\1T6y!D%+e62龝l o@Z%e3ZWe,KCFm9)gI)PB ' 9mFuDP!}tX0,^Xve`8ժAFYtI8)~!$s,t daT"8h94sVdd8@d{aI˻e$>G5Mxr9U5Mzub\\HJi=2҂0K9F6{rG RgȬ9#+vc},^+G`9yT-KDTlߝG.Ǣ66?r1?ϊ .jh8$cPs `30M9YiF4r[=Aa;繷%DCTy@isL+w۠5+egz|d=QMJ]l7 k%pySd%)9Y1mo &jZ.{^IJƏiz&%-4`06WےcX^lu9E,9Ea]l( (іKC&)+rFWyp4Q^C ĻY"_ B3..qy9Ur4qح Ray^P# F ^5R;:ә+WyeVd)H{MA!Hb#v+]`}pNp.p\^@큜` }׉U@MzGKڨeY) wc*^Ր !_.4~s=%y Ψ7I3UW3lc}G nn#m@8|'xmE/ Y]~[ ]6ͦ"Uk~ZV2[  CsVK>5 _< Kۍdkwd?ڞ3Ƴ96߮.nC!V0ӎXb 3n5>g-cժ>lʵsX/y,}f7G,m(y /h{>lՏZ5̚;nq nxbU 6dzGfaFyMFYë.⻻i./q}iu.0vΊ̌î^f ٫ÛI֋a,Jہm7 :&m=gmYF:kw0k%w[7=kxx}ÜFΏVF_ƛ+UcƐKj26{ hn.Y[.Yҙ9Z}O0:l?joGW>A&+0>GV<|dӘB+U͌WOӬ$zRqʮwFOO˻zwO6ԶcW{=w 1.%+ZFc=w"Zƪ)r#pb^s)4QNկ=;֟M6OE$`{!C `e\ 4>uI`KmMДĻ%V#,$8̂Oo.|Ye`.Sޏoo\Y-l/7n?s.`77i.Zz]pV{)綧?̉ۺv*jnPUdD8p&# 24aeY;d+?@`Zx A\@`[E AC@@hБ!q@Lfj5;1xSA?W=dD c< lMqE~׍-rrm{S[d ǯ7?_or ?2_of ?i^Mz᧙zG_N ?#7 ӄX^.b--5ߤfy36)ݡLN_8)Yώ?,hB(rޅȂsrQz,jIw :w`±O actzuO )IIĔn5){^c7zm[Oƙd<7J%-UsGiɲ$9Y'& 5bdQA^cUIMqʋ[{ 8U^ rIy1haNC"x1XeO#s`ʋO#΋Ty1aʋ%/J,/SHNXnR^$SJOIdϋ5Wx' )@2BdD$f3dL"E` EIԙq唼XЧʋ5ѩb'b#9U^LS*/6zxIy1OŖn^,xo}vŒ/O%)6~y}(,R!o]ksƒ+(~J y?\ZJU㺎eR D0$(w) "e$&[HF(MwQ{d(Jң譛V>jh>Emo߅jY.n\RFQWsIW lNhd 7[Bz_rt}bϋ`?1E`_Ĕ$IK#=0vJ%jO+šm,*gİ#PG{~Ftg=3x5]0Yi7Nڊ? ɴ5JbZ ẟ=H3؊HgFB(8•Nw n S }quQ3qٵ *ԟkcoa?]bq?nț(7QÌ⯊<8ʙ: fRٺܯ˳~5MD_>#ܜU*9in?`t` LlwDVRWLMe`6GraKgyzpa(hIxts FFHpiB4f4QJ% @M(*\?c8;U|PR;> t<:.q%֢dڽrndQօ_ό`C+Y>͐D}\G1=̑Rrws@ =AWvC\[R/kVĎcq dr^aՑTogv0{XPҮ*k]FŮӰW-rR9s^J˖ yl[3~C+y}G2Ye婻'Evl< Yv9jǫ̋!vETϣպm$˝HWӑh)+ҧnkCCu5,.gVpiݰ26ӫ.Y\l]n>Â1~ۉ̆XѾ!o#_0i%"?p,unֽ[FPHCG-]B~Ym=Xt F=lx3: MJ++Hv4cj<ߺK҆zQw<vSd=2[g:jDK@(׍z Olt -E`.:껑EG0p[ыjDS tK.\}m %f{:9%@v {Zj=&ml ~ke}ԟ_g~ee N\(! 
K-[JxV-EګȺ'Q )XG$jiSkcJ`.(ՠpť ,DV ?zh:K$=S]_^T;J }(Z=]և,kLCz-Jږc+B!.8kN)9-CZڡއF)_?tEI@hMW[bL]dg't2ٜ !%0УOML;3C -+Osw^ඁƝxt!uZNIZLhtmI|A)څ Z2amcC;lp_j~TSly&&:lFbwDA(Z7zS*,osX :ש\v CCX(1BD5elxP'5Y GS|PQĨĸl 3DxJ8Ќ\RǸ2+nHp1c GBP`"ݲ/r*h%\,qi$)W$N#18(,Z JQ*̨T !zgACeY5u_'p^)fC_&wJKLN?W0E'ۼɟnjgi4ʋ(» &6|rO*zs:/.o4\r?Wi[pZ7V^Kvܕq߭Կ~P&t:'o@loNEg=sUrѓhz5̊8^пMxH˯Q&.00(1<#/g?_ `8ް FFH,c!e3X{  Ǿ0ySf:lMT(k5",!<㉢Cr5] W=% [7*8݂m~n?ܹ{t:,>5Φ)i_h_{ 4#ǘ X k26'*dQ Bx?-,{C%޵:ߚ1y 0Z0̮az_*Ř3Yat6%9.ў 9. ޑbZMˈAx y}팟[g(ʟ\|0JRM>xW" b ^c.yc(6&Nj[AsŽ;[vaHMg&3i ,b&FN$ a!BLFj \6IKf8Y!pϔ e"8n*C0,|W]mO4BV*-@0,qI*;,M7Sd PX%,:ר!^5в R`%:iBaeIyhC(2)`)ho.s"K~Pk!, s3>1ÈqOD~=EdZ, '-"^Usm݊w`Fy/Sy6R?VG# )oySׄIP"@#/;~!HK/Ԑɂkl'cq h̽57csgy4̏aОBDgDVtIl"~ߗ.|R7^:޹ TI6WE{];)D\OpeZ '0It4p 7LJ}='HUz,yԇ x륻aip6&?y`^Zaqݕeź^,sQGG.w6hMNQ\HD;#,'V .2EiOQ~Ɨ@xXB9h8]a+1 ᧣(3<}?`?a:/󿚐1,XIԟ9XRe;\ގ1#I1#sv{rU{ᣂ=\8赗M g4Whg~1,h ^WO˔~Wemjx}Evta0M|YaN3 BTgh%'T+}0LXsɹv`Jk4Cu:MYjpRjǸܡ@"EexAu!BӗQ)45:~-p,EBeIL6 e8Q8`5*U$CNdU4#)ͨ*VtϖkeH 4ϕ2o Y4&٭a;\TG',?]B/T/Pu.D  #.$N䢬K(dl 1Zr|c 0N h(&W2? qӱOޅ2u,b^'qW9M\ϊ8 `(Ư1Gůʙ{thŽǟ(?{OƑ_!e7T߇x/[< 1deBcEʱ!) )R:_~,#X{σ@e<>@җ|S5炄D f/4sM⑺b5&[!(+9Mf~\[{iY>0ǀ]+u_c[A2Wo\ʉn ҹG,>*ۺ538*y}b2%죖o ?i[}b U9AS>~@o!%T@Rf acL8CmF[v>O ;;c<*Q#tm.Jxm2Ļ#ARmT +`|/z_ Jo s{(CXyGúx@ ^cP0c0ҴZY>/lLejAv9;h1W?LlQrD*K1YQP%g>p<eAX ڧ~?)#[-H̾bk &9xF٩6M0)4~:b-םk+ . fN~ϪU6J|qFU&m 0eab Rzy^o2~Sy9O?ՌzrU訢PN3.WS)c9X Y4`O`H !9 ״%ПgU\jbhrӉFW@JuEmv4'ߡ̤+ }$X(03Y @9E-Kj}-4AQY4fXcl- f4hȲp((c(okW4VR ̺xbGI)d3pB)wQL[IJӜ>c}3OtA2Gs"m{ _. 
f4]#2//ǒTZvߣNl>'% ~?A֪?x fd^vmLf=kіQuWficd^dLLv$uC90BXoE`"%ږy6N@uI&UZ&@Z'*k]-83R5S誢HRi0)kf>_'Y?8J-:&Qrwr #bpS";uI&t=>Iw+3 1c!Re*N׿o+E.s1FVmYi`R7by4c ::r ԫg%/]{g|]w{_9[o5Wr@_Du_KTfЯ4/} 4~rF)xݴbŠVO9;;}ZSGxOƾRD4|+}pBMz-ai]o kdݭ?nbߥu{의;kXJ}Ւ~s9?$H+[0P0%$QPas{$TJexJV$?h( gjdvޡ|"=f-_vd>>s0pnRpdDE+<&ֻH,2/E `r!g7 |=69\bâM|ވ`lੂ;wi(^9uQ<ٜ3<9E7-rM:~zY?n{ǩ HTo{K~y섥RZᣟ' ADVEҘ_~d'׋9nS+}ď gT1%Ry2rSsO`g410 hWkt3HiSFL4NA53̅'?y =cj s敐:3wL*N^˕Gope?ʬQiOR iV:n:N:;-Z}9yH2,'nQh S( }&vuQj;6Wt@_D>UyulV~&T„}[ۋ?~Cs`/>%キEEsuo rQ A:\r*ۇY5fH0WPё(,X@(:`Rlx|,dibzĨiZ=Fߗi*&KG08%akR6oa]eCBx FQOƒ<G p48J=[N<~ !Rd d,B`D˜T H,ơJ19d!h#tǏ[R?C!w4ǡ ΣhCaW#ȚBcڶt0T[q v|QxKG&ɽgV ǧE $M9]yt /E MVK1Y^lE[cfWF/T3R!m iZXh sx{N>DfΙc/3wHh+$Cf+H:_>~O/jQs+2p& %[ddRUcW->|HUE[ƤYzȄDU`Dqun%qݽM/^i7uǘ ˬ#^(Ϡ:|t⒓5 #=Bb.;k \ywe!g^[ w50XՁ_3?I>cjHAxD%eC>;2:S c/0KiW-iu8םᜣ0*[8Dgs௨X zc q, );@>"^NʁgSnOo];̈5@9&*(>X7Ӕx&$"@{$Dmv}"65ADV+ې̹E<"hd Bi*PɠA+*(!g $mB}:M+#Lrm=6u-GW#e["?#tAGE^kgLZ4ۍtL0:)8*0xP= Yf}L>G `R |g0|"llnFzcG T(TpہUSWQqQNp GDVM}f! TJjq?s'wcDYa{?ɫ׳h)'k*{ y~f:rrഔTQgRŸ5xwX+DjB;Ö(ň:O ЯRcYmvߘ8r ;@``#Gў.}a p.@h o}i*I c!)#4ŠRe]ۀvߐI&00vHOeVAZ i?|rimd6dQ|+m+G/d^#@>%`gwGtoQ!;n7D~YUdVЪ EqA<F[vDnpmz }w!cAT5\1n+m!ngͭ5ٙ6Bq)+.o.`nvEnt'_5rR4ܵ,ڸA,hbǛ reÛei,p͌#cڮzSSNھm 4ptu;ovL4<Ƀpn-g#*)lǫJPJZ 2VNx f )*J^hDqJ{vlSD)dnvd{pġ>4*2Su܁OqIN!rhܞZ1Z# @>2EOЃeFըkt4nu@h;E2 /KbJK^]ׄrZԳص0폭z&x/,o[;w"t?0v1P*+7yGE-|QHnnD1ѶiTڠ1 ;g2w .oz5iX`cL6)X(HIrCӐ=1m [Νu>o֎ s>xپB^&}qUK VĴ7rFZ;atT?;[BXO9*odat+8վe%s~*߯aЊӁCY+@uAfŚH$T'x|٘Ą0 l|rډ̩ UDx)`@RoI71ZG,5qHl}x8mgAY7[dAw(FU P/;"ejVT荃Od $CĄDHE{縡E!NCTMi>XXG?~ۧmmJhXeBHe sBQS+ 2#QS!2""dt ;wsݏ랴^%bhJv\y"_#h^8Y/@(Ygcv&X`k&; då Gkְ`95SsyY_$;1#G}S9(2SL]l΁zG sF OfH_Q|~.fGt;@FI}W4']^93 ӻŅq[M7ϋS=;u^sm9 _Ami0J(Drkĉ@ف ܁|}uB]?@$ ځOf3O!=8"“ vaT"y-mpK!=)&|k9zWOF?,.?ٟG2dki1JnӽKq7IV%)j߆+z&HKq a$_|e|Xaq^=s7ewRRQ ȠSp-W!<Z{-˺O ^ֻ2jPM8js&rS@h4_ =(UQ6= GS?z=e%6dA3aO+[,ɀH*9ErL`BQDKneK>\n齿n,,Q 5 #ݢO_Nիϓ;Wd^e2nJn,Y^NnCr hik/,Ny6^:HM(.fhx umf5P+ ?^LF$iodJ`W̔@n^$aǖ íۋV4.?ZgW6V\-<^igG O.d-l QR' IhaUZ\5 5G2\(X7S@("Za Hꗿgm)8ݳB MMT:5xF |ڄ\^6x!ytA6 
0Ɛ=caOf(GCvm-c>,i=Rv~'Yrb-N1 Mmd]kO>|ߍ6E{^2&vzy? _9ejBKĜ/CCݎ֟X_wwd|a7q\X ^T8~XZƧ%i'dωijʨ Ũǁ86B1o$+[Q#1_$V9C4%⨐4rbT' ]k{0آ#r-&m ȌJ:Xz"ZbgxrEP% , *&)YPCmZJ9b @~􈩷=BX CBt JKROu*N'b@ЩAD/ P+]#skSTSnxLw1ŢA-XYbؠԸ;ǭԦt-X70Ͻw9@RQqDn%R*FF#8q]mc&KF+w{_w5%swh75;rN ZnUzr?&OᇟBM/X rIW!4 GѣHgaHD"9?OQ"@PzgbXjmCR,3PZY)DҪ=1M{R8=iGya|LZ3.bd`.%X{ 5n 85RepD.)dާޒOpiJ̝|oRSJ/_~ )I7]{܌mp~oϺ` ZB ؆H[@r!CZh^0*Cjlun~mY wSZkeKրR_G?8 cқnw?T(E"d$LF&DG2/o.8]j')zŲLeKޣ?(pN&58A;& ^6_"Αx;1jq[@ANYw2W˧m֛4ȲA)?um n=mpo닱 y z ί9(?j[Z-zxoے?׵]K s81=阽=һjD`B΍RE2Su> 6٩mwx岃! Պul,PݵiCTUcnVf-u1=m:ah8 smZ'{7$ a :t$RsvkT<'% R \|x0Af;؝ZC݈?vF|wi&n $ʁkx37 sihϠ:EvZ.JzuwS$aآOqcv+@rnfͭdv(co%څ/2{#l[ }յ8ob%kL Q%^0*D:&P0oUf؜?mR"g戧^}.W)Ձc0?ݺԾ%~7JkMKu)_1Hd7q-]|P%剡wipѸ:FՑ*2?Ś^v=yꉊ,u^rݍ^mr޾( w[!=~Nqi;mxJ7׀)4R:%[=4?{>k|Rb. x=wk1}ڤ$m*n)@w%"]Uŝb6+,Pn ʗfCYY# pMb- 7S'p!*q$̪)*HzoY#F-z,cS);N{dbc%cnlݽ0k>9,[n[U)`BL9,콩  ꏯ2?%=xقJʃжzUI%_*J;KYM这FPNoypmHC} jqn:)YLi-T2TNuX&l"15N_*iE#:z[Y'N0B*k9T@"BE-M6E@uzWrXCYCP~+pڋpuJ" ORZ+H*&Q(![fޗDD{ 01D \+$VKjSV98A+ȁ0D&&D$EKZ]wM(1ZܞZ4m?Txw]z|j_!䜂h먕Jk5o*zx_YVF#uX =V?+O.dUY/QwɊ: ~ϸiO8B̜JNŽeky1PFWfRt{>*.?jnK?pd;~hR/ozMӗim,bJS _d 9yJP r,ĊP  )BepxRXD${ƍ$/~?}s8MI._v!;#?Vg ߯,즚d<2zp`%@0q&8HWQaCƼ7qMqU>w센H1Y=p l/WJۻ8,bR`X_Vχgoã=u}w>Cͪ׶"!l$em >zʗ*EoʇI>2RTPBA. V}8 qx>8>s{.zj]۹4FP]wl\8q (;_7\JA/7KO旿țJG!]Iz(|~rv˧A|߯o{~2_OAAOVo rg `Yi @`IppКLL6}6oVW$*a2Y*yQ:/P࿨ϨFm5Xpa9YbR\p?F>Ā G" q#½mF6@TzU(MFh !aYm\з7p`}܃3`Oxg(5rk:dfh-! Sg\B8VKX׶v"p e$5޷_ӳcv','#2jIqN8@zi&:GC'h%"fJ!D={5"1WڢmJn53zb:£Hc gȴ|"dԝ%dǔ#ʝTbG4բy$jhlV'%ѝA (X3s"<J*A}fM/I!iϪIVnC[a_󷏿i p=WZ(Եm^}!.2rYr"oU%e"cL[s%,tcZihȘhfMx}xw)mGD8jv#|8< NyxIb>K1}D~Γx|V*3;(EZnvS4\^dd@8EGP~-A^b,??$Oo"/)w\_! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 10:13:05 crc kubenswrapper[4733]: body: Mar 18 10:13:05 crc kubenswrapper[4733]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:54.78342919 +0000 UTC m=+14.275163545,LastTimestamp:2026-03-18 10:12:54.78342919 +0000 UTC m=+14.275163545,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 10:13:05 crc kubenswrapper[4733]: > Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.541596 4733 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de7df5d565238 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:54.783504952 +0000 UTC m=+14.275239317,LastTimestamp:2026-03-18 10:12:54.783504952 +0000 UTC m=+14.275239317,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.547955 4733 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 10:13:05 crc kubenswrapper[4733]: &Event{ObjectMeta:{kube-apiserver-crc.189de7df6ab053e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 10:13:05 crc kubenswrapper[4733]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 10:13:05 crc kubenswrapper[4733]: Mar 18 10:13:05 crc kubenswrapper[4733]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:55.007507426 +0000 UTC m=+14.499241801,LastTimestamp:2026-03-18 10:12:55.007507426 +0000 UTC m=+14.499241801,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 10:13:05 crc kubenswrapper[4733]: > Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.554496 4733 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de7df6ab3c578 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:55.007733112 +0000 UTC m=+14.499467477,LastTimestamp:2026-03-18 10:12:55.007733112 +0000 UTC m=+14.499467477,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.561829 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de7df6ab053e2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 10:13:05 crc kubenswrapper[4733]: &Event{ObjectMeta:{kube-apiserver-crc.189de7df6ab053e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 10:13:05 crc kubenswrapper[4733]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 10:13:05 crc kubenswrapper[4733]: Mar 
18 10:13:05 crc kubenswrapper[4733]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:55.007507426 +0000 UTC m=+14.499241801,LastTimestamp:2026-03-18 10:12:55.01878875 +0000 UTC m=+14.510523115,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 10:13:05 crc kubenswrapper[4733]: > Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.570237 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de7df6ab3c578\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de7df6ab3c578 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:55.007733112 +0000 UTC m=+14.499467477,LastTimestamp:2026-03-18 10:12:55.018986246 +0000 UTC m=+14.510720611,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.577508 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de7dce0dd72da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de7dce0dd72da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:44.105274074 +0000 UTC m=+3.597008399,LastTimestamp:2026-03-18 10:12:55.279168555 +0000 UTC m=+14.770902880,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.583441 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189de7dced2cd1a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de7dced2cd1a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:44.311802272 +0000 UTC m=+3.803536597,LastTimestamp:2026-03-18 10:12:55.491477155 +0000 UTC m=+14.983211520,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.588772 4733 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189de7dcee58714f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189de7dcee58714f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:44.331438415 +0000 UTC m=+3.823172740,LastTimestamp:2026-03-18 10:12:55.501787814 +0000 UTC m=+14.993522149,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.596531 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de7df5d552a46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 10:13:05 crc kubenswrapper[4733]: &Event{ObjectMeta:{kube-controller-manager-crc.189de7df5d552a46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 10:13:05 crc kubenswrapper[4733]: body: Mar 18 10:13:05 crc 
kubenswrapper[4733]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:54.78342919 +0000 UTC m=+14.275163545,LastTimestamp:2026-03-18 10:13:04.78306823 +0000 UTC m=+24.274802555,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 10:13:05 crc kubenswrapper[4733]: > Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.601485 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de7df5d565238\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de7df5d565238 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:54.783504952 +0000 UTC m=+14.275239317,LastTimestamp:2026-03-18 10:13:04.783148033 +0000 UTC m=+24.274882358,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:05 crc kubenswrapper[4733]: W0318 10:13:05.964158 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 10:13:05 crc kubenswrapper[4733]: E0318 10:13:05.964279 4733 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 10:13:06 crc kubenswrapper[4733]: I0318 10:13:06.125512 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:07 crc kubenswrapper[4733]: I0318 10:13:07.128272 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:08 crc kubenswrapper[4733]: I0318 10:13:08.124119 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:08 crc kubenswrapper[4733]: I0318 10:13:08.407930 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:13:08 crc kubenswrapper[4733]: I0318 10:13:08.410781 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:08 crc kubenswrapper[4733]: I0318 10:13:08.410848 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:08 crc kubenswrapper[4733]: I0318 10:13:08.410868 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:08 crc kubenswrapper[4733]: I0318 10:13:08.410906 4733 kubelet_node_status.go:76] "Attempting to register node" 
node="crc" Mar 18 10:13:08 crc kubenswrapper[4733]: E0318 10:13:08.416615 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 10:13:08 crc kubenswrapper[4733]: E0318 10:13:08.416662 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 10:13:09 crc kubenswrapper[4733]: I0318 10:13:09.128263 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:09 crc kubenswrapper[4733]: W0318 10:13:09.615579 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 10:13:09 crc kubenswrapper[4733]: E0318 10:13:09.615680 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 10:13:10 crc kubenswrapper[4733]: I0318 10:13:10.126737 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:10 crc kubenswrapper[4733]: W0318 10:13:10.577849 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:10 crc kubenswrapper[4733]: E0318 10:13:10.577950 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 10:13:11 crc kubenswrapper[4733]: I0318 10:13:11.124164 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:11 crc kubenswrapper[4733]: E0318 10:13:11.247593 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 10:13:12 crc kubenswrapper[4733]: I0318 10:13:12.125443 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.125879 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.176150 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.177935 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.177998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.178018 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.178877 4733 scope.go:117] "RemoveContainer" containerID="512d400fdc468389180501b48b185ef2c56dbc18c94fa4a8dbd0c2ea829f8c95" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.502989 4733 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:37322->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.503097 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:37322->192.168.126.11:10357: read: connection reset by peer" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.503239 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.503528 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.505626 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.505715 4733 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.505737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.506686 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 10:13:13 crc kubenswrapper[4733]: I0318 10:13:13.506988 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff" gracePeriod=30 Mar 18 10:13:13 crc kubenswrapper[4733]: E0318 10:13:13.517285 4733 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 10:13:13 crc kubenswrapper[4733]: &Event{ObjectMeta:{kube-controller-manager-crc.189de7e3b91c203a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:37322->192.168.126.11:10357: read: connection reset by peer Mar 18 10:13:13 crc 
kubenswrapper[4733]: body: Mar 18 10:13:13 crc kubenswrapper[4733]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:13:13.503064122 +0000 UTC m=+32.994798497,LastTimestamp:2026-03-18 10:13:13.503064122 +0000 UTC m=+32.994798497,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 10:13:13 crc kubenswrapper[4733]: > Mar 18 10:13:13 crc kubenswrapper[4733]: E0318 10:13:13.523594 4733 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de7e3b91d6b0e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:37322->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:13:13.503148814 +0000 UTC m=+32.994883179,LastTimestamp:2026-03-18 10:13:13.503148814 +0000 UTC m=+32.994883179,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:13 crc kubenswrapper[4733]: E0318 10:13:13.534726 4733 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de7e3b957a10c 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:13:13.506963724 +0000 UTC m=+32.998698089,LastTimestamp:2026-03-18 10:13:13.506963724 +0000 UTC m=+32.998698089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:13:13 crc kubenswrapper[4733]: E0318 10:13:13.542095 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de7dc70bd9cce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de7dc70bd9cce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:42.22413947 +0000 UTC m=+1.715873795,LastTimestamp:2026-03-18 10:13:13.526273978 +0000 UTC m=+33.018008353,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 10:13:13 crc kubenswrapper[4733]: E0318 10:13:13.737968 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de7dc8408c6ec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de7dc8408c6ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:42.547832556 +0000 UTC m=+2.039566881,LastTimestamp:2026-03-18 10:13:13.729688945 +0000 UTC m=+33.221423270,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 10:13:13 crc kubenswrapper[4733]: E0318 10:13:13.749813 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de7dc84ee1035\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de7dc84ee1035 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:42.562859061 +0000 UTC m=+2.054593386,LastTimestamp:2026-03-18 10:13:13.739650195 +0000 UTC m=+33.231384520,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.123504 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.352418 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.355732 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11"}
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.355956 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.357457 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.357515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.357532 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.361704 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.362394 4733 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff" exitCode=255
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.362465 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff"}
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.362518 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2"}
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.362671 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.363852 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.363895 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:14 crc kubenswrapper[4733]: I0318 10:13:14.363913 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.125940 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.368562 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.369534 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.372601 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11" exitCode=255
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.372668 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11"}
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.372718 4733 scope.go:117] "RemoveContainer" containerID="512d400fdc468389180501b48b185ef2c56dbc18c94fa4a8dbd0c2ea829f8c95"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.373006 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.374445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.374508 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.374531 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.375556 4733 scope.go:117] "RemoveContainer" containerID="c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11"
Mar 18 10:13:15 crc kubenswrapper[4733]: E0318 10:13:15.375849 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.417547 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.419395 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.419451 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.419469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:15 crc kubenswrapper[4733]: I0318 10:13:15.419507 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 10:13:15 crc kubenswrapper[4733]: E0318 10:13:15.423382 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 10:13:15 crc kubenswrapper[4733]: E0318 10:13:15.423735 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 10:13:16 crc kubenswrapper[4733]: I0318 10:13:16.124047 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:16 crc kubenswrapper[4733]: I0318 10:13:16.378525 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 10:13:16 crc kubenswrapper[4733]: I0318 10:13:16.515056 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:13:16 crc kubenswrapper[4733]: I0318 10:13:16.515292 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:16 crc kubenswrapper[4733]: I0318 10:13:16.516613 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:16 crc kubenswrapper[4733]: I0318 10:13:16.516692 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:16 crc kubenswrapper[4733]: I0318 10:13:16.516717 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:16 crc kubenswrapper[4733]: I0318 10:13:16.517757 4733 scope.go:117] "RemoveContainer" containerID="c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11"
Mar 18 10:13:16 crc kubenswrapper[4733]: E0318 10:13:16.518076 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 10:13:17 crc kubenswrapper[4733]: I0318 10:13:17.127306 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.124980 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.316358 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.316642 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.318357 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.318409 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.318423 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.319073 4733 scope.go:117] "RemoveContainer" containerID="c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11"
Mar 18 10:13:18 crc kubenswrapper[4733]: E0318 10:13:18.319318 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.708813 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.709422 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.711176 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.711280 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:18 crc kubenswrapper[4733]: I0318 10:13:18.711299 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:19 crc kubenswrapper[4733]: I0318 10:13:19.125090 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:20 crc kubenswrapper[4733]: I0318 10:13:20.125825 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:21 crc kubenswrapper[4733]: I0318 10:13:21.125775 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:21 crc kubenswrapper[4733]: E0318 10:13:21.247847 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 10:13:21 crc kubenswrapper[4733]: I0318 10:13:21.782377 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 10:13:21 crc kubenswrapper[4733]: I0318 10:13:21.782599 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:21 crc kubenswrapper[4733]: I0318 10:13:21.784612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:21 crc kubenswrapper[4733]: I0318 10:13:21.784663 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:21 crc kubenswrapper[4733]: I0318 10:13:21.784698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:22 crc kubenswrapper[4733]: I0318 10:13:22.125415 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:22 crc kubenswrapper[4733]: I0318 10:13:22.424109 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:22 crc kubenswrapper[4733]: I0318 10:13:22.426018 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:22 crc kubenswrapper[4733]: I0318 10:13:22.426092 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:22 crc kubenswrapper[4733]: I0318 10:13:22.426110 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:22 crc kubenswrapper[4733]: I0318 10:13:22.426157 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 10:13:22 crc kubenswrapper[4733]: E0318 10:13:22.431109 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 10:13:22 crc kubenswrapper[4733]: E0318 10:13:22.431351 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 10:13:23 crc kubenswrapper[4733]: I0318 10:13:23.125281 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:24 crc kubenswrapper[4733]: I0318 10:13:24.123417 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:24 crc kubenswrapper[4733]: I0318 10:13:24.782629 4733 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 10:13:24 crc kubenswrapper[4733]: I0318 10:13:24.782735 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 10:13:24 crc kubenswrapper[4733]: E0318 10:13:24.790344 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de7df5d552a46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 18 10:13:24 crc kubenswrapper[4733]: &Event{ObjectMeta:{kube-controller-manager-crc.189de7df5d552a46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 18 10:13:24 crc kubenswrapper[4733]: body:
Mar 18 10:13:24 crc kubenswrapper[4733]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:54.78342919 +0000 UTC m=+14.275163545,LastTimestamp:2026-03-18 10:13:24.782698705 +0000 UTC m=+44.274433070,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 10:13:24 crc kubenswrapper[4733]: >
Mar 18 10:13:24 crc kubenswrapper[4733]: E0318 10:13:24.796850 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de7df5d565238\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189de7df5d565238 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:54.783504952 +0000 UTC m=+14.275239317,LastTimestamp:2026-03-18 10:13:24.782769797 +0000 UTC m=+44.274504162,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 10:13:25 crc kubenswrapper[4733]: I0318 10:13:25.123372 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:26 crc kubenswrapper[4733]: I0318 10:13:26.124273 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:27 crc kubenswrapper[4733]: I0318 10:13:27.124245 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:27 crc kubenswrapper[4733]: W0318 10:13:27.474138 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 18 10:13:27 crc kubenswrapper[4733]: E0318 10:13:27.474262 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 18 10:13:28 crc kubenswrapper[4733]: I0318 10:13:28.126611 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:29 crc kubenswrapper[4733]: I0318 10:13:29.125957 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:29 crc kubenswrapper[4733]: I0318 10:13:29.432362 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:29 crc kubenswrapper[4733]: I0318 10:13:29.434507 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:29 crc kubenswrapper[4733]: I0318 10:13:29.434591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:29 crc kubenswrapper[4733]: I0318 10:13:29.434616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:29 crc kubenswrapper[4733]: I0318 10:13:29.434672 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 10:13:29 crc kubenswrapper[4733]: E0318 10:13:29.440720 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 10:13:29 crc kubenswrapper[4733]: E0318 10:13:29.440804 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 10:13:29 crc kubenswrapper[4733]: W0318 10:13:29.890441 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 18 10:13:29 crc kubenswrapper[4733]: E0318 10:13:29.890530 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 18 10:13:30 crc kubenswrapper[4733]: I0318 10:13:30.126391 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:30 crc kubenswrapper[4733]: I0318 10:13:30.175180 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:30 crc kubenswrapper[4733]: I0318 10:13:30.177235 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:30 crc kubenswrapper[4733]: I0318 10:13:30.177324 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:30 crc kubenswrapper[4733]: I0318 10:13:30.177349 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:30 crc kubenswrapper[4733]: I0318 10:13:30.178742 4733 scope.go:117] "RemoveContainer" containerID="c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11"
Mar 18 10:13:30 crc kubenswrapper[4733]: E0318 10:13:30.179118 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 10:13:31 crc kubenswrapper[4733]: I0318 10:13:31.124860 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:31 crc kubenswrapper[4733]: E0318 10:13:31.248636 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 10:13:31 crc kubenswrapper[4733]: W0318 10:13:31.693606 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 18 10:13:31 crc kubenswrapper[4733]: E0318 10:13:31.693710 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 18 10:13:32 crc kubenswrapper[4733]: W0318 10:13:32.080474 4733 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:32 crc kubenswrapper[4733]: E0318 10:13:32.080581 4733 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 18 10:13:32 crc kubenswrapper[4733]: I0318 10:13:32.127510 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:32 crc kubenswrapper[4733]: I0318 10:13:32.861608 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 10:13:32 crc kubenswrapper[4733]: I0318 10:13:32.861918 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:32 crc kubenswrapper[4733]: I0318 10:13:32.863949 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:32 crc kubenswrapper[4733]: I0318 10:13:32.864305 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:32 crc kubenswrapper[4733]: I0318 10:13:32.864328 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:33 crc kubenswrapper[4733]: I0318 10:13:33.128128 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:34 crc kubenswrapper[4733]: I0318 10:13:34.125935 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:34 crc kubenswrapper[4733]: I0318 10:13:34.782968 4733 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 10:13:34 crc kubenswrapper[4733]: I0318 10:13:34.783063 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 10:13:34 crc kubenswrapper[4733]: E0318 10:13:34.787765 4733 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189de7df5d552a46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 18 10:13:34 crc kubenswrapper[4733]: &Event{ObjectMeta:{kube-controller-manager-crc.189de7df5d552a46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 18 10:13:34 crc kubenswrapper[4733]: body:
Mar 18 10:13:34 crc kubenswrapper[4733]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:12:54.78342919 +0000 UTC m=+14.275163545,LastTimestamp:2026-03-18 10:13:34.783039193 +0000 UTC m=+54.274773558,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 18 10:13:34 crc kubenswrapper[4733]: >
Mar 18 10:13:35 crc kubenswrapper[4733]: I0318 10:13:35.126251 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:36 crc kubenswrapper[4733]: I0318 10:13:36.126016 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:36 crc kubenswrapper[4733]: I0318 10:13:36.440856 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:36 crc kubenswrapper[4733]: I0318 10:13:36.441714 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:36 crc kubenswrapper[4733]: I0318 10:13:36.441766 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:36 crc kubenswrapper[4733]: I0318 10:13:36.441783 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:36 crc kubenswrapper[4733]: I0318 10:13:36.441817 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 10:13:36 crc kubenswrapper[4733]: E0318 10:13:36.447156 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 18 10:13:36 crc kubenswrapper[4733]: E0318 10:13:36.447347 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 18 10:13:37 crc kubenswrapper[4733]: I0318 10:13:37.128396 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:38 crc kubenswrapper[4733]: I0318 10:13:38.125158 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:39 crc kubenswrapper[4733]: I0318 10:13:39.123246 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:40 crc kubenswrapper[4733]: I0318 10:13:40.122819 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.123350 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.174920 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.176718 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.176759 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.176783 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.181995 4733 scope.go:117] "RemoveContainer" containerID="c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11"
Mar 18 10:13:41 crc kubenswrapper[4733]: E0318 10:13:41.249302 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.456976 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.458970 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38"}
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.459148 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.460627 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.460670 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.460681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.786445 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.786641 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.787773 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.787812 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.787825 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:13:41 crc kubenswrapper[4733]: I0318 10:13:41.789842 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 10:13:42 crc
kubenswrapper[4733]: I0318 10:13:42.122682 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.463000 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.463998 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.465589 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38" exitCode=255 Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.465658 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38"} Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.465771 4733 scope.go:117] "RemoveContainer" containerID="c5df0e453e549f1c53a257294fcfd5535a89c6524d17c1ea699e8a9a21a19a11" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.465945 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.465954 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.467249 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.467275 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.467285 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.467425 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.467454 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.467469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:42 crc kubenswrapper[4733]: I0318 10:13:42.467837 4733 scope.go:117] "RemoveContainer" containerID="ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38" Mar 18 10:13:42 crc kubenswrapper[4733]: E0318 10:13:42.467993 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 10:13:43 crc kubenswrapper[4733]: I0318 10:13:43.124278 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:43 crc kubenswrapper[4733]: I0318 10:13:43.447864 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 10:13:43 crc kubenswrapper[4733]: I0318 10:13:43.449287 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:43 crc kubenswrapper[4733]: I0318 10:13:43.449349 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:43 crc kubenswrapper[4733]: I0318 10:13:43.449366 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:43 crc kubenswrapper[4733]: I0318 10:13:43.449401 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 10:13:43 crc kubenswrapper[4733]: E0318 10:13:43.453744 4733 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 10:13:43 crc kubenswrapper[4733]: E0318 10:13:43.454417 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 10:13:43 crc kubenswrapper[4733]: I0318 10:13:43.470038 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 10:13:44 crc kubenswrapper[4733]: I0318 10:13:44.123734 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:45 crc kubenswrapper[4733]: I0318 10:13:45.122968 4733 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" 
is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 10:13:45 crc kubenswrapper[4733]: I0318 10:13:45.575438 4733 csr.go:261] certificate signing request csr-4ptw7 is approved, waiting to be issued Mar 18 10:13:45 crc kubenswrapper[4733]: I0318 10:13:45.582410 4733 csr.go:257] certificate signing request csr-4ptw7 is issued Mar 18 10:13:45 crc kubenswrapper[4733]: I0318 10:13:45.643786 4733 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 10:13:45 crc kubenswrapper[4733]: I0318 10:13:45.979242 4733 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 10:13:46 crc kubenswrapper[4733]: I0318 10:13:46.515054 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:13:46 crc kubenswrapper[4733]: I0318 10:13:46.515228 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:13:46 crc kubenswrapper[4733]: I0318 10:13:46.516397 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:46 crc kubenswrapper[4733]: I0318 10:13:46.516427 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:46 crc kubenswrapper[4733]: I0318 10:13:46.516439 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:46 crc kubenswrapper[4733]: I0318 10:13:46.517033 4733 scope.go:117] "RemoveContainer" containerID="ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38" Mar 18 10:13:46 crc kubenswrapper[4733]: E0318 10:13:46.517220 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 10:13:46 crc kubenswrapper[4733]: I0318 10:13:46.583378 4733 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-19 14:06:01.863061398 +0000 UTC Mar 18 10:13:46 crc kubenswrapper[4733]: I0318 10:13:46.583448 4733 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5907h52m15.279618985s for next certificate rotation Mar 18 10:13:48 crc kubenswrapper[4733]: I0318 10:13:48.316783 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:13:48 crc kubenswrapper[4733]: I0318 10:13:48.316952 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:13:48 crc kubenswrapper[4733]: I0318 10:13:48.318239 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:48 crc kubenswrapper[4733]: I0318 10:13:48.318284 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:48 crc kubenswrapper[4733]: I0318 10:13:48.318293 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:48 crc kubenswrapper[4733]: I0318 10:13:48.318875 4733 scope.go:117] "RemoveContainer" containerID="ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38" Mar 18 10:13:48 crc kubenswrapper[4733]: E0318 10:13:48.319065 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.454292 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.455985 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.456044 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.456070 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.456224 4733 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.465881 4733 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.466305 4733 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.466346 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.469823 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.469872 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.469886 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.469904 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.469919 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:13:50Z","lastTransitionTime":"2026-03-18T10:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.487077 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.498730 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.498810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.498830 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.498850 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.498864 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:13:50Z","lastTransitionTime":"2026-03-18T10:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.526705 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.538745 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.538777 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.538784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.538799 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.538809 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:13:50Z","lastTransitionTime":"2026-03-18T10:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.552599 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.561148 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.561181 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.561205 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.561223 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:13:50 crc kubenswrapper[4733]: I0318 10:13:50.561232 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:13:50Z","lastTransitionTime":"2026-03-18T10:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.574674 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.574799 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.574825 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.675427 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.776468 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.877538 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:50 crc kubenswrapper[4733]: E0318 10:13:50.978145 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.078278 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.179308 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.249862 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.280235 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: 
E0318 10:13:51.381041 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.482528 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.583556 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.684408 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.785426 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.887002 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:51 crc kubenswrapper[4733]: E0318 10:13:51.988299 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.089507 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.190336 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.290764 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.391875 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.492023 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.595382 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.696748 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.797117 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.898059 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:52 crc kubenswrapper[4733]: E0318 10:13:52.998899 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.099562 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.200145 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.300435 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.401165 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.502311 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.602503 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.703247 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.803562 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:53 crc kubenswrapper[4733]: E0318 10:13:53.904631 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:14:54.005354 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.105617 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.206520 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.307723 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.408699 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.508843 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.609750 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.710877 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.811616 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:54 crc kubenswrapper[4733]: E0318 10:13:54.912738 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.013786 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.114930 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.215765 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.316237 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.417364 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.518261 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.619138 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.720612 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.821349 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:55 crc kubenswrapper[4733]: E0318 10:13:55.922263 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.022584 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.123673 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.225030 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.326072 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.426559 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.527537 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.627936 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.728759 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.829710 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:56 crc kubenswrapper[4733]: E0318 10:13:56.930605 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.031670 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.132077 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.232470 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.333591 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.433664 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.534107 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.634239 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.734403 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.835271 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:57 crc kubenswrapper[4733]: E0318 10:13:57.936226 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.036336 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.136842 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.237436 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.337787 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.437882 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.538461 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.639516 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: I0318 10:13:58.694941 4733 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.740643 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.841836 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:58 crc kubenswrapper[4733]: E0318 10:13:58.942863 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.043650 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.144789 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.245681 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.346407 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.446580 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.547392 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.647874 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.748831 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.849343 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:13:59 crc kubenswrapper[4733]: E0318 10:13:59.950101 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.051157 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.152155 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.175395 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.176513 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.176598 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.176622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.177152 4733 scope.go:117] "RemoveContainer" containerID="ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.177328 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.253152 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.353455 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.454155 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.554869 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.655739 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.756810 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.838601 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.843746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.843786 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.843802 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.843821 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.843834 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:00Z","lastTransitionTime":"2026-03-18T10:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.858652 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.863239 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.863269 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.863280 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.863299 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:00 crc kubenswrapper[4733]: I0318 10:14:00.863311 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:00Z","lastTransitionTime":"2026-03-18T10:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.917851 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.918001 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 18 10:14:00 crc kubenswrapper[4733]: E0318 10:14:00.918037 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.018616 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.119145 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.219707 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.250373 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.320589 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.421453 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.522445 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.622715 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.723023 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.823377 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:01 crc kubenswrapper[4733]: E0318 10:14:01.923704 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.024163 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.124716 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.225776 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.326260 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.426855 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.527015 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.627700 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.729042 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.830019 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:02 crc kubenswrapper[4733]: E0318 10:14:02.930923 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.031137 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.132106 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.233280 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.333921 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.435123 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.536062 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.637337 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.738622 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.839477 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:03 crc kubenswrapper[4733]: E0318 10:14:03.940121 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.040261 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.141316 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.241859 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.342167 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.442732 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.543464 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.644282 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.762865 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.863910 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:04 crc kubenswrapper[4733]: E0318 10:14:04.964899 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.065895 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.167167 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.268280 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.369384 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.470540 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.570885 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.672059 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.772801 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.873965 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:05 crc kubenswrapper[4733]: E0318 10:14:05.974817 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.075814 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.177060 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.278274 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.379030 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.479526 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.580090 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.681199 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.782102 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.883040 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:06 crc kubenswrapper[4733]: E0318 10:14:06.984082 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.085264 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.185670 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.286501 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.387511 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.487985 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.589121 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.689710 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.790339 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.891497 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:07 crc kubenswrapper[4733]: E0318 10:14:07.992262 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.093347 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.193981 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.294963 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.395479 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.496543 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.597585 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.698291 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.798593 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.899765 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:08 crc kubenswrapper[4733]: E0318 10:14:08.999906 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.100088 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.200649 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.301459 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.402648 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.503008 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.604122 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.704296 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.804454 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:09 crc kubenswrapper[4733]: E0318 10:14:09.905532 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.006136 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.106561 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.207395 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.308219 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.409263 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.509422 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.609958 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.710846 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.811133 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:10 crc kubenswrapper[4733]: E0318 10:14:10.912049 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.012564 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.113571 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.116904 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.121553 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.121596 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.121609 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.121628 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.121642 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:11Z","lastTransitionTime":"2026-03-18T10:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.130951 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.134659 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.134677 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.134685 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.134698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.134706 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:11Z","lastTransitionTime":"2026-03-18T10:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.143347 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.146346 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.146361 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.146370 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.146382 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.146391 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:11Z","lastTransitionTime":"2026-03-18T10:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.154683 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.158933 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.158966 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.158977 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.158995 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.159007 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:11Z","lastTransitionTime":"2026-03-18T10:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.169658 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.169809 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.174634 4733 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.175524 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.175579 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.175591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:11 crc kubenswrapper[4733]: I0318 10:14:11.176290 4733 scope.go:117] "RemoveContainer" containerID="ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.176522 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.214013 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.250710 4733 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.314944 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.416008 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.516593 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.617035 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.718061 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.818702 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:11 crc kubenswrapper[4733]: E0318 10:14:11.919612 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.020731 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.121083 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 
10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.221526 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.321639 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.422732 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.523377 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.623563 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.723961 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.824991 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:12 crc kubenswrapper[4733]: E0318 10:14:12.925529 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.026697 4733 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.048336 4733 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.129383 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.129446 4733 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.129469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.129497 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.129522 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.158685 4733 apiserver.go:52] "Watching apiserver" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.167173 4733 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.168796 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-7pxwd","openshift-multus/multus-additional-cni-plugins-t28sh","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-image-registry/node-ca-xfvfl","openshift-multus/multus-g6j2q","openshift-network-operator/iptables-alerter-4ln5h","openshift-dns/node-resolver-hsk58","openshift-multus/network-metrics-daemon-4s425","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-machine-config-operator/machine-config-daemon-2h7dp"] Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.169280 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.169289 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.169486 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.169900 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.172111 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.172136 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.170765 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.172221 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.170982 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.172303 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.171060 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.170877 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.172705 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.169906 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.171019 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.171171 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hsk58" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.170149 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.171321 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.173481 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.174221 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.174986 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.175983 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.176021 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.177622 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.181903 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.182267 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.182495 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.184860 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.184950 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.184870 4733 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.184962 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.185540 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.185649 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.185740 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.185966 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.185979 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.186109 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.186239 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.186256 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.186553 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 10:14:13 crc 
kubenswrapper[4733]: I0318 10:14:13.186558 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.188021 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.188945 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.189416 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.189732 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.190013 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.190287 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.190612 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.190850 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.191067 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.191928 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 10:14:13 crc 
kubenswrapper[4733]: I0318 10:14:13.192244 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.192543 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.192867 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.194531 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.200689 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.204253 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.222227 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2c181c8-3361-40a2-afc5-a677e0ab4ecd-hosts-file\") pod \"node-resolver-hsk58\" (UID: \"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\") " pod="openshift-dns/node-resolver-hsk58" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.222476 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.222514 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9zpb\" (UniqueName: \"kubernetes.io/projected/b3650177-e338-4eba-ab42-bc0cd14c9d65-kube-api-access-x9zpb\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.222543 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httph\" (UniqueName: \"kubernetes.io/projected/c2c181c8-3361-40a2-afc5-a677e0ab4ecd-kube-api-access-httph\") pod \"node-resolver-hsk58\" (UID: \"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\") " pod="openshift-dns/node-resolver-hsk58" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.223522 4733 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.223759 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.233295 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.233336 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.233354 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.233379 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.233399 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.241174 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.253665 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.266072 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.280452 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.288819 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.297732 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.311856 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323104 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323166 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323286 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") 
" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323325 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323363 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323398 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323430 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323463 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323495 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323530 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323560 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323593 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323627 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323661 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 
18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323723 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323758 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323793 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323912 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.323963 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324001 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324036 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324082 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324119 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324152 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324208 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324260 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324426 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324468 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324504 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324541 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324622 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324656 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324689 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324722 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324772 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324806 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324844 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324881 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324913 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324947 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.324981 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325057 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325130 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325283 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325411 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325528 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325457 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325652 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325687 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325721 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325758 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325882 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325957 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325991 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326029 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326066 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326099 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326131 4733 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326164 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326221 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326255 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326285 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326315 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326346 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326377 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326411 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326443 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326476 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326508 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326542 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326580 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326612 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326643 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326677 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326709 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326743 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326777 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326810 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326758 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326842 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327010 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325884 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: 
"1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.325969 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326144 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326379 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326415 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326959 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.326949 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327265 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327286 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327600 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327625 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327657 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327755 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327783 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327796 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.327831 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328171 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328168 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328119 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328124 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328384 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328494 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328734 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328780 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.328960 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329120 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329374 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329578 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329618 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.329737 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:14:13.828074935 +0000 UTC m=+93.319809340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.330776 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329757 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329833 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329871 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329971 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.329978 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.330154 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.330153 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.330302 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.330373 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.330514 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.330808 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.330826 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331320 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331328 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331335 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331444 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331464 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331642 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331686 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331729 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.331721 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.332214 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.332261 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.332357 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.332449 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.332658 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.332726 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333033 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333099 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333110 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333392 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333433 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333458 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333518 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333517 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333563 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333808 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.333872 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334023 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334303 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334314 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334524 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334342 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334451 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334459 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334607 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334624 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334660 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334689 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334748 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.334995 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335217 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335004 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335518 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335592 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335666 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335708 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335723 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335768 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335805 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335840 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335908 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.335985 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336039 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336265 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336369 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336294 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336499 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336599 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336673 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336691 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336700 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.336755 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337050 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337128 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337368 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337419 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337434 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337463 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337603 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337645 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337662 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.337863 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338331 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338423 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338506 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338760 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338825 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338759 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338858 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338928 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338953 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.338970 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339009 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339048 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339214 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339256 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339295 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339328 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339331 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339356 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339374 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339466 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339508 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339543 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339582 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339615 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339734 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339777 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339818 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339828 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339819 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339955 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339991 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340023 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340054 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340089 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340123 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340238 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340271 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340305 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340344 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") 
" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340376 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340407 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340440 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340473 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340505 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340536 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340605 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340637 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340670 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340700 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340733 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 
10:14:13.340764 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340800 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340831 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340865 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340899 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340933 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340966 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341002 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341035 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341070 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341104 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " 
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341137 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341171 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341241 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341277 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341311 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341343 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341377 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341414 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341448 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341482 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.339945 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340068 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340209 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340405 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340443 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340481 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.340758 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341018 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341098 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341311 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341552 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341580 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341876 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.341984 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342009 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342021 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342029 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342081 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342161 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342336 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342349 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342369 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342397 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342420 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342444 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342469 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342494 4733 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342517 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342540 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342564 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342586 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342607 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342633 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342662 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342698 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342725 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342748 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342781 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342825 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342856 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342892 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342922 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342947 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342949 
4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342964 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343009 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343021 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343102 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-systemd-units\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343326 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwk4s\" (UniqueName: \"kubernetes.io/projected/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-kube-api-access-xwk4s\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343382 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-httph\" (UniqueName: \"kubernetes.io/projected/c2c181c8-3361-40a2-afc5-a677e0ab4ecd-kube-api-access-httph\") pod \"node-resolver-hsk58\" (UID: \"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\") " pod="openshift-dns/node-resolver-hsk58" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343418 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9zpb\" (UniqueName: 
\"kubernetes.io/projected/b3650177-e338-4eba-ab42-bc0cd14c9d65-kube-api-access-x9zpb\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343444 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343460 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb58b528-9013-4fab-9747-60bb6ff1bc1f-serviceca\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343504 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cnibin\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343545 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-kubelet\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343579 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-conf-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343610 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb58b528-9013-4fab-9747-60bb6ff1bc1f-host\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343641 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-var-lib-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343674 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6f75e1c5-e0c5-43df-944f-77b734070793-rootfs\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343703 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f75e1c5-e0c5-43df-944f-77b734070793-proxy-tls\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343737 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343771 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343805 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg7jp\" (UniqueName: \"kubernetes.io/projected/bb58b528-9013-4fab-9747-60bb6ff1bc1f-kube-api-access-zg7jp\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343837 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-cnibin\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343870 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-os-release\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 
10:14:13.343902 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-multus-certs\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343933 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-node-log\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343966 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344001 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqxdr\" (UniqueName: \"kubernetes.io/projected/73327417-4d3b-45f1-b3b6-575fdeeaa31a-kube-api-access-zqxdr\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344035 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-bin\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344072 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpnv6\" (UniqueName: \"kubernetes.io/projected/6f75e1c5-e0c5-43df-944f-77b734070793-kube-api-access-xpnv6\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344102 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-os-release\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344138 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2c181c8-3361-40a2-afc5-a677e0ab4ecd-hosts-file\") pod \"node-resolver-hsk58\" (UID: \"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\") " pod="openshift-dns/node-resolver-hsk58" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344173 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-etc-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344234 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-log-socket\") pod \"ovnkube-node-7pxwd\" (UID: 
\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344271 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-env-overrides\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344305 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-cni-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344336 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-k8s-cni-cncf-io\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344373 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-cni-multus\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344412 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-socket-dir-parent\") pod \"multus-g6j2q\" 
(UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344449 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344482 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d693a73-68c1-4595-bbcc-be97691b06fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344512 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-systemd\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344543 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-system-cni-dir\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344612 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-cni-binary-copy\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344643 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-cni-bin\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344675 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-etc-kubernetes\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344706 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph8vv\" (UniqueName: \"kubernetes.io/projected/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-kube-api-access-ph8vv\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344746 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344779 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-kubelet\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344810 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-netns\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344851 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.342328 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343509 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343713 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.343877 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344182 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344336 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344370 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344739 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344879 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344889 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.345387 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.345418 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.345403 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.345858 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.345932 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2c181c8-3361-40a2-afc5-a677e0ab4ecd-hosts-file\") pod \"node-resolver-hsk58\" (UID: \"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\") " pod="openshift-dns/node-resolver-hsk58" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.345950 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.345957 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.346235 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.346285 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.346375 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.346519 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.346613 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:13.846588872 +0000 UTC m=+93.338323227 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.346950 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.347472 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.347741 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.349183 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.349949 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.350312 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.350671 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.351788 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.351802 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.351853 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.360350 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.362082 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.362329 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.362483 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.362966 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.363081 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.363588 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.363825 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.363921 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.364352 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.364360 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.364965 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.365043 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.365064 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.365453 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.365778 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.365822 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366180 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366391 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.344896 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-system-cni-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366767 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366803 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366814 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366824 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-ovn\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366879 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366920 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367026 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-slash\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367062 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367074 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367090 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-script-lib\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367121 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367148 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367178 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f75e1c5-e0c5-43df-944f-77b734070793-mcd-auth-proxy-config\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367225 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367252 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d693a73-68c1-4595-bbcc-be97691b06fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367275 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d693a73-68c1-4595-bbcc-be97691b06fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367273 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367307 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg7hc\" (UniqueName: \"kubernetes.io/projected/7d693a73-68c1-4595-bbcc-be97691b06fe-kube-api-access-vg7hc\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367418 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367450 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367320 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367465 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367477 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367504 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367572 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367720 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367737 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367796 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367848 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-netd\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367898 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-config\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367953 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367970 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.367999 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368012 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-netns\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368068 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-hostroot\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368123 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-daemon-config\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368176 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.366580 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368271 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368328 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368379 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovn-node-metrics-cert\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368782 4733 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368833 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368841 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368867 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368898 4733 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368929 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368959 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.368987 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369016 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 
10:14:13.369046 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369078 4733 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369107 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369149 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369185 4733 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.369235 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.369317 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:13.869295816 +0000 UTC m=+93.361030161 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369242 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369631 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369648 4733 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369664 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369679 4733 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369692 4733 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc 
kubenswrapper[4733]: I0318 10:14:13.369707 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369720 4733 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369735 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369748 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369762 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369775 4733 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369788 4733 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369800 4733 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369813 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369826 4733 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369838 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369834 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.370008 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.370050 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:13.870038077 +0000 UTC m=+93.361772412 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370168 4733 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370481 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.369853 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370541 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370562 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370582 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370600 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370624 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370646 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370667 4733 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370687 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370707 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370725 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370744 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370762 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370784 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370802 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370820 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370837 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370854 4733 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node 
\"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370873 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370892 4733 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370910 4733 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370927 4733 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370945 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370963 4733 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.370981 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 
10:14:13.371001 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371018 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371035 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371052 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371066 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371078 4733 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371091 4733 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371103 4733 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371116 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371128 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371131 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371141 4733 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371154 4733 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371168 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371181 4733 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371224 4733 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371242 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371261 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371276 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371288 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371301 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371315 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371327 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371340 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371353 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371365 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371378 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371390 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371403 4733 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath 
\"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371415 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371429 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371441 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371454 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371468 4733 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371479 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371492 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 
10:14:13.371505 4733 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371524 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371535 4733 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371549 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371562 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371574 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371587 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371603 4733 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371616 4733 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371628 4733 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371641 4733 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371652 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371664 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371676 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371688 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371701 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371713 4733 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371728 4733 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371740 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371752 4733 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371765 4733 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371777 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371789 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371802 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371813 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371826 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371839 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371851 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371863 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" 
DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371876 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371888 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371900 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371913 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371927 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371939 4733 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371950 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371962 4733 
reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371974 4733 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371985 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.371997 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372008 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372020 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372032 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372044 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372056 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372067 4733 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372078 4733 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372090 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372101 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372113 4733 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372125 4733 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: 
I0318 10:14:13.372137 4733 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372150 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372163 4733 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372174 4733 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372206 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372224 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372240 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372257 4733 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372273 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372291 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372306 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372325 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372342 4733 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372354 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372448 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" 
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372465 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372486 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372499 4733 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372513 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372599 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372635 4733 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372658 4733 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 
10:14:13.372679 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372700 4733 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372724 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.372731 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.372786 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.372815 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.372963 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-18 10:14:13.872921257 +0000 UTC m=+93.364655792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.372743 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.373041 4733 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.373067 4733 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.373090 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.373111 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.373139 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.373163 4733 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.383110 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.383554 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.383769 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.385436 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.385162 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z
qxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.387949 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.388357 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.388872 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.391482 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.391609 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.392714 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.392748 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.392772 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.392871 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:13.892842293 +0000 UTC m=+93.384576818 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.392990 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9zpb\" (UniqueName: \"kubernetes.io/projected/b3650177-e338-4eba-ab42-bc0cd14c9d65-kube-api-access-x9zpb\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.393426 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.393969 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-httph\" (UniqueName: \"kubernetes.io/projected/c2c181c8-3361-40a2-afc5-a677e0ab4ecd-kube-api-access-httph\") pod \"node-resolver-hsk58\" (UID: \"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\") " pod="openshift-dns/node-resolver-hsk58" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.396335 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 
10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.397706 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.398521 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.399043 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.399058 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.399258 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.399386 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.399155 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.400357 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.400747 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.404636 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.407146 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.417363 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.420870 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.427537 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.432810 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.433049 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.439790 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.442551 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.442599 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.442611 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.442632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.442647 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.447556 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.456675 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474133 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-slash\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474166 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474205 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-script-lib\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474242 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f75e1c5-e0c5-43df-944f-77b734070793-mcd-auth-proxy-config\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474263 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d693a73-68c1-4595-bbcc-be97691b06fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-spfjj\" 
(UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474281 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg7hc\" (UniqueName: \"kubernetes.io/projected/7d693a73-68c1-4595-bbcc-be97691b06fe-kube-api-access-vg7hc\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474302 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474321 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474342 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d693a73-68c1-4595-bbcc-be97691b06fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474431 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-slash\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474495 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474538 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-config\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474642 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474722 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-netns\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.474909 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-netns\") pod 
\"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475293 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-script-lib\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475347 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d693a73-68c1-4595-bbcc-be97691b06fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475384 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-hostroot\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475362 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475453 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-daemon-config\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " 
pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475491 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-netd\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475517 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475490 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-hostroot\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475541 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-config\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475549 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovn-node-metrics-cert\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475565 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d693a73-68c1-4595-bbcc-be97691b06fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475678 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475702 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-netd\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475706 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-systemd-units\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475758 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwk4s\" (UniqueName: \"kubernetes.io/projected/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-kube-api-access-xwk4s\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475755 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-systemd-units\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.475299 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f75e1c5-e0c5-43df-944f-77b734070793-mcd-auth-proxy-config\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476017 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb58b528-9013-4fab-9747-60bb6ff1bc1f-serviceca\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476119 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-kubelet\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476145 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-conf-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476170 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cnibin\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476214 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb58b528-9013-4fab-9747-60bb6ff1bc1f-host\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476240 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-var-lib-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476274 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f75e1c5-e0c5-43df-944f-77b734070793-proxy-tls\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476299 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476317 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-conf-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476327 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476446 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg7jp\" (UniqueName: \"kubernetes.io/projected/bb58b528-9013-4fab-9747-60bb6ff1bc1f-kube-api-access-zg7jp\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476481 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-cnibin\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476532 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-os-release\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476580 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6f75e1c5-e0c5-43df-944f-77b734070793-rootfs\") pod \"machine-config-daemon-2h7dp\" (UID: 
\"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476245 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-kubelet\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476636 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-multus-certs\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476680 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-node-log\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476671 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-multus-certs\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476708 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476734 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqxdr\" (UniqueName: \"kubernetes.io/projected/73327417-4d3b-45f1-b3b6-575fdeeaa31a-kube-api-access-zqxdr\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476332 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cnibin\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476747 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-var-lib-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476621 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-daemon-config\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476793 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-os-release\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476805 
4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-node-log\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476676 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb58b528-9013-4fab-9747-60bb6ff1bc1f-host\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476832 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6f75e1c5-e0c5-43df-944f-77b734070793-rootfs\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476734 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-cnibin\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476755 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpnv6\" (UniqueName: \"kubernetes.io/projected/6f75e1c5-e0c5-43df-944f-77b734070793-kube-api-access-xpnv6\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476912 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-os-release\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476935 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-etc-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476954 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-log-socket\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476972 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-bin\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476992 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-cni-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477011 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-k8s-cni-cncf-io\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.476858 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477032 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-cni-multus\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477055 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-env-overrides\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477081 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-socket-dir-parent\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477086 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-log-socket\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477100 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477104 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-bin\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477139 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-cni-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477159 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d693a73-68c1-4595-bbcc-be97691b06fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477172 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-os-release\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477180 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-system-cni-dir\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477230 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-cni-binary-copy\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477252 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-etc-openvswitch\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477248 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-cni-bin\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477291 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-etc-kubernetes\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477293 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477308 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-systemd\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477327 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-multus-socket-dir-parent\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477353 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-kubelet\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477412 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-kubelet\") pod 
\"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477418 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-cni-multus\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477452 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-etc-kubernetes\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477458 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-systemd\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477487 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-system-cni-dir\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477512 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-run-k8s-cni-cncf-io\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " 
pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477714 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-netns\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477747 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-netns\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477787 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-system-cni-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477821 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph8vv\" (UniqueName: \"kubernetes.io/projected/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-kube-api-access-ph8vv\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477923 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-system-cni-dir\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.477983 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-host-var-lib-cni-bin\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.478271 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-env-overrides\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.478332 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.478600 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-ovn\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.478644 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-cni-binary-copy\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479293 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-ovn\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479349 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479586 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479597 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479608 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479617 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479627 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479636 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479646 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479655 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479666 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479675 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479685 4733 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479694 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479704 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node 
\"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479715 4733 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479725 4733 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479737 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479748 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479758 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479768 4733 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.479780 4733 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 10:14:13 crc 
kubenswrapper[4733]: I0318 10:14:13.479803 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb58b528-9013-4fab-9747-60bb6ff1bc1f-serviceca\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.480533 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.489916 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovn-node-metrics-cert\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.490251 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f75e1c5-e0c5-43df-944f-77b734070793-proxy-tls\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.491348 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d693a73-68c1-4595-bbcc-be97691b06fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc 
kubenswrapper[4733]: I0318 10:14:13.493105 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqxdr\" (UniqueName: \"kubernetes.io/projected/73327417-4d3b-45f1-b3b6-575fdeeaa31a-kube-api-access-zqxdr\") pod \"ovnkube-node-7pxwd\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.494523 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwk4s\" (UniqueName: \"kubernetes.io/projected/0f82588a-9dbd-4c55-8cfc-f96e57fa58b9-kube-api-access-xwk4s\") pod \"multus-additional-cni-plugins-t28sh\" (UID: \"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\") " pod="openshift-multus/multus-additional-cni-plugins-t28sh" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.495822 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph8vv\" (UniqueName: \"kubernetes.io/projected/cc85b0d4-15a5-4894-9f07-9aaeb28f63fa-kube-api-access-ph8vv\") pod \"multus-g6j2q\" (UID: \"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\") " pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.496071 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg7hc\" (UniqueName: \"kubernetes.io/projected/7d693a73-68c1-4595-bbcc-be97691b06fe-kube-api-access-vg7hc\") pod \"ovnkube-control-plane-749d76644c-spfjj\" (UID: \"7d693a73-68c1-4595-bbcc-be97691b06fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.497097 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpnv6\" (UniqueName: \"kubernetes.io/projected/6f75e1c5-e0c5-43df-944f-77b734070793-kube-api-access-xpnv6\") pod \"machine-config-daemon-2h7dp\" (UID: \"6f75e1c5-e0c5-43df-944f-77b734070793\") " 
pod="openshift-machine-config-operator/machine-config-daemon-2h7dp"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.498865 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg7jp\" (UniqueName: \"kubernetes.io/projected/bb58b528-9013-4fab-9747-60bb6ff1bc1f-kube-api-access-zg7jp\") pod \"node-ca-xfvfl\" (UID: \"bb58b528-9013-4fab-9747-60bb6ff1bc1f\") " pod="openshift-image-registry/node-ca-xfvfl"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.520774 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.521433 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xfvfl"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.531082 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.536392 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t28sh"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.545015 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.547639 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.547728 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.547786 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.547860 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.547919 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 10:14:13 crc kubenswrapper[4733]: W0318 10:14:13.547794 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e36bf33aebc25dbcc802b0414aa206e2fb77c5c19a7273e84ddfa5ff8f3ed1da WatchSource:0}: Error finding container e36bf33aebc25dbcc802b0414aa206e2fb77c5c19a7273e84ddfa5ff8f3ed1da: Status 404 returned error can't find the container with id e36bf33aebc25dbcc802b0414aa206e2fb77c5c19a7273e84ddfa5ff8f3ed1da
Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.552882 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj"
Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.554846 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Mar 18 10:14:13 crc kubenswrapper[4733]: set -o allexport
Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 18 10:14:13 crc kubenswrapper[4733]: source /etc/kubernetes/apiserver-url.env
Mar 18 10:14:13 crc kubenswrapper[4733]: else
Mar 18 10:14:13 crc kubenswrapper[4733]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 18 10:14:13 crc kubenswrapper[4733]: exit 1
Mar 18 10:14:13 crc kubenswrapper[4733]: fi
Mar 18 10:14:13 crc kubenswrapper[4733]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 18 10:14:13 crc kubenswrapper[4733]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.554921 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 10:14:13 crc kubenswrapper[4733]: while [ true ]; Mar 18 10:14:13 crc kubenswrapper[4733]: do Mar 18 10:14:13 crc kubenswrapper[4733]: for f in $(ls /tmp/serviceca); do Mar 18 10:14:13 crc 
kubenswrapper[4733]: echo $f Mar 18 10:14:13 crc kubenswrapper[4733]: ca_file_path="/tmp/serviceca/${f}" Mar 18 10:14:13 crc kubenswrapper[4733]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 10:14:13 crc kubenswrapper[4733]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 10:14:13 crc kubenswrapper[4733]: if [ -e "${reg_dir_path}" ]; then Mar 18 10:14:13 crc kubenswrapper[4733]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 10:14:13 crc kubenswrapper[4733]: else Mar 18 10:14:13 crc kubenswrapper[4733]: mkdir $reg_dir_path Mar 18 10:14:13 crc kubenswrapper[4733]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: for d in $(ls /etc/docker/certs.d); do Mar 18 10:14:13 crc kubenswrapper[4733]: echo $d Mar 18 10:14:13 crc kubenswrapper[4733]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 10:14:13 crc kubenswrapper[4733]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 10:14:13 crc kubenswrapper[4733]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 18 10:14:13 crc kubenswrapper[4733]: rm -rf /etc/docker/certs.d/$d Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: sleep 60 & wait ${!} Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zg7jp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-xfvfl_openshift-image-registry(bb58b528-9013-4fab-9747-60bb6ff1bc1f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.556626 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-xfvfl" podUID="bb58b528-9013-4fab-9747-60bb6ff1bc1f" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.556680 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.557802 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hsk58" Mar 18 10:14:13 crc kubenswrapper[4733]: W0318 10:14:13.561762 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73327417_4d3b_45f1_b3b6_575fdeeaa31a.slice/crio-35bea9a3e63456f3c4522f7b18c54f2df3fc823d29bd3059264ea8e5f121d012 WatchSource:0}: Error finding container 35bea9a3e63456f3c4522f7b18c54f2df3fc823d29bd3059264ea8e5f121d012: Status 404 returned error can't find the container with id 35bea9a3e63456f3c4522f7b18c54f2df3fc823d29bd3059264ea8e5f121d012 Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.563714 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.569968 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"35bea9a3e63456f3c4522f7b18c54f2df3fc823d29bd3059264ea8e5f121d012"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.570333 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:14:13 crc kubenswrapper[4733]: W0318 10:14:13.570351 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f82588a_9dbd_4c55_8cfc_f96e57fa58b9.slice/crio-ff1d93cef35a316c51b4c062176258993d12b241a2a73aaf584a089109ebf4c1 WatchSource:0}: Error finding container ff1d93cef35a316c51b4c062176258993d12b241a2a73aaf584a089109ebf4c1: Status 404 returned error can't find the container with id ff1d93cef35a316c51b4c062176258993d12b241a2a73aaf584a089109ebf4c1 Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.571284 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xfvfl" event={"ID":"bb58b528-9013-4fab-9747-60bb6ff1bc1f","Type":"ContainerStarted","Data":"da169ce5c75cd05bb49cb01f0a3fda10717fc3999af212b34c0dcccc7dbbab26"} Mar 18 10:14:13 crc kubenswrapper[4733]: W0318 10:14:13.572407 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-7fb0376f2eec0926827408777c433a08963622c37f222da5c978e502f9dbcbfd WatchSource:0}: Error finding container 7fb0376f2eec0926827408777c433a08963622c37f222da5c978e502f9dbcbfd: Status 404 returned error can't find the container with id 
7fb0376f2eec0926827408777c433a08963622c37f222da5c978e502f9dbcbfd Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.572577 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 10:14:13 crc kubenswrapper[4733]: apiVersion: v1 Mar 18 10:14:13 crc kubenswrapper[4733]: clusters: Mar 18 10:14:13 crc kubenswrapper[4733]: - cluster: Mar 18 10:14:13 crc kubenswrapper[4733]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 10:14:13 crc kubenswrapper[4733]: server: https://api-int.crc.testing:6443 Mar 18 10:14:13 crc kubenswrapper[4733]: name: default-cluster Mar 18 10:14:13 crc kubenswrapper[4733]: contexts: Mar 18 10:14:13 crc kubenswrapper[4733]: - context: Mar 18 10:14:13 crc kubenswrapper[4733]: cluster: default-cluster Mar 18 10:14:13 crc kubenswrapper[4733]: namespace: default Mar 18 10:14:13 crc kubenswrapper[4733]: user: default-auth Mar 18 10:14:13 crc kubenswrapper[4733]: name: default-context Mar 18 10:14:13 crc kubenswrapper[4733]: current-context: default-context Mar 18 10:14:13 crc kubenswrapper[4733]: kind: Config Mar 18 10:14:13 crc kubenswrapper[4733]: preferences: {} Mar 18 10:14:13 crc kubenswrapper[4733]: users: Mar 18 10:14:13 crc kubenswrapper[4733]: - name: default-auth Mar 18 10:14:13 crc kubenswrapper[4733]: user: Mar 18 10:14:13 crc kubenswrapper[4733]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 10:14:13 crc kubenswrapper[4733]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 10:14:13 crc kubenswrapper[4733]: EOF Mar 18 10:14:13 crc kubenswrapper[4733]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqxdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.573609 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e36bf33aebc25dbcc802b0414aa206e2fb77c5c19a7273e84ddfa5ff8f3ed1da"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.577312 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-g6j2q" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.578112 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 10:14:13 crc kubenswrapper[4733]: set -o allexport Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: source /etc/kubernetes/apiserver-url.env Mar 18 10:14:13 crc kubenswrapper[4733]: else Mar 18 10:14:13 crc kubenswrapper[4733]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 10:14:13 crc kubenswrapper[4733]: exit 1 Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 10:14:13 crc kubenswrapper[4733]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.578130 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ -f "/env/_master" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: set -o allexport Mar 18 10:14:13 crc kubenswrapper[4733]: source "/env/_master" Mar 18 10:14:13 crc kubenswrapper[4733]: set +o allexport Mar 18 10:14:13 crc 
kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 10:14:13 crc kubenswrapper[4733]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 10:14:13 crc kubenswrapper[4733]: ho_enable="--enable-hybrid-overlay" Mar 18 10:14:13 crc kubenswrapper[4733]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 10:14:13 crc kubenswrapper[4733]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 10:14:13 crc kubenswrapper[4733]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 10:14:13 crc kubenswrapper[4733]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 10:14:13 crc kubenswrapper[4733]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 10:14:13 crc kubenswrapper[4733]: --webhook-host=127.0.0.1 \ Mar 18 10:14:13 crc kubenswrapper[4733]: --webhook-port=9743 \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${ho_enable} \ Mar 18 10:14:13 crc kubenswrapper[4733]: --enable-interconnect \ Mar 18 10:14:13 crc kubenswrapper[4733]: --disable-approver \ Mar 18 10:14:13 crc kubenswrapper[4733]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 10:14:13 crc kubenswrapper[4733]: --wait-for-kubernetes-api=200s \ Mar 18 10:14:13 crc kubenswrapper[4733]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 10:14:13 crc kubenswrapper[4733]: --loglevel="${LOGLEVEL}" Mar 18 10:14:13 crc kubenswrapper[4733]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc 
kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.579072 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 10:14:13 crc kubenswrapper[4733]: while [ true ]; Mar 18 10:14:13 crc kubenswrapper[4733]: do Mar 18 10:14:13 crc kubenswrapper[4733]: for f in $(ls /tmp/serviceca); do Mar 18 10:14:13 crc kubenswrapper[4733]: echo $f Mar 18 10:14:13 crc kubenswrapper[4733]: ca_file_path="/tmp/serviceca/${f}" Mar 18 10:14:13 crc kubenswrapper[4733]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 10:14:13 crc kubenswrapper[4733]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 10:14:13 crc kubenswrapper[4733]: if [ -e "${reg_dir_path}" ]; then Mar 18 10:14:13 crc kubenswrapper[4733]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 10:14:13 crc kubenswrapper[4733]: else Mar 18 10:14:13 crc kubenswrapper[4733]: mkdir $reg_dir_path Mar 18 10:14:13 crc kubenswrapper[4733]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: for d in $(ls /etc/docker/certs.d); do Mar 18 10:14:13 crc kubenswrapper[4733]: echo $d Mar 18 10:14:13 crc kubenswrapper[4733]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 10:14:13 crc kubenswrapper[4733]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 10:14:13 crc kubenswrapper[4733]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 18 10:14:13 crc kubenswrapper[4733]: rm -rf /etc/docker/certs.d/$d Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: sleep 60 & wait ${!} Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zg7jp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-xfvfl_openshift-image-registry(bb58b528-9013-4fab-9747-60bb6ff1bc1f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.579337 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.580368 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-xfvfl" podUID="bb58b528-9013-4fab-9747-60bb6ff1bc1f" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.580410 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.580659 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.583244 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwk4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-t28sh_openshift-multus(0f82588a-9dbd-4c55-8cfc-f96e57fa58b9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.584564 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-t28sh" podUID="0f82588a-9dbd-4c55-8cfc-f96e57fa58b9" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.585529 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ -f "/env/_master" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: set -o allexport Mar 18 10:14:13 crc kubenswrapper[4733]: source "/env/_master" Mar 18 10:14:13 crc kubenswrapper[4733]: set +o allexport Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 10:14:13 crc kubenswrapper[4733]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 10:14:13 crc kubenswrapper[4733]: --disable-webhook \ Mar 18 10:14:13 crc kubenswrapper[4733]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 10:14:13 crc kubenswrapper[4733]: --loglevel="${LOGLEVEL}" Mar 18 10:14:13 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.586842 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.590580 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.604095 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: W0318 10:14:13.604975 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-082c29e3548c80a38ab2a902a222a7bd38f378ff323b000dc303bf38651a6b0f WatchSource:0}: Error finding container 082c29e3548c80a38ab2a902a222a7bd38f378ff323b000dc303bf38651a6b0f: Status 404 returned error can't find the container with id 082c29e3548c80a38ab2a902a222a7bd38f378ff323b000dc303bf38651a6b0f Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.606321 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 10:14:13 crc kubenswrapper[4733]: set -uo pipefail Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 10:14:13 crc kubenswrapper[4733]: HOSTS_FILE="/etc/hosts" Mar 18 10:14:13 crc kubenswrapper[4733]: TEMP_FILE="/etc/hosts.tmp" Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: # Make a temporary file with the old hosts file's attributes. Mar 18 10:14:13 crc kubenswrapper[4733]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 10:14:13 crc kubenswrapper[4733]: echo "Failed to preserve hosts file. Exiting." 
Mar 18 10:14:13 crc kubenswrapper[4733]: exit 1 Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: while true; do Mar 18 10:14:13 crc kubenswrapper[4733]: declare -A svc_ips Mar 18 10:14:13 crc kubenswrapper[4733]: for svc in "${services[@]}"; do Mar 18 10:14:13 crc kubenswrapper[4733]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 10:14:13 crc kubenswrapper[4733]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 10:14:13 crc kubenswrapper[4733]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 10:14:13 crc kubenswrapper[4733]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 10:14:13 crc kubenswrapper[4733]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 10:14:13 crc kubenswrapper[4733]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 10:14:13 crc kubenswrapper[4733]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 10:14:13 crc kubenswrapper[4733]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 10:14:13 crc kubenswrapper[4733]: for i in ${!cmds[*]} Mar 18 10:14:13 crc kubenswrapper[4733]: do Mar 18 10:14:13 crc kubenswrapper[4733]: ips=($(eval "${cmds[i]}")) Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: svc_ips["${svc}"]="${ips[@]}" Mar 18 10:14:13 crc kubenswrapper[4733]: break Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: # Update /etc/hosts only if we get valid service IPs Mar 18 10:14:13 crc kubenswrapper[4733]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 10:14:13 crc kubenswrapper[4733]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 10:14:13 crc kubenswrapper[4733]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 10:14:13 crc kubenswrapper[4733]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 10:14:13 crc kubenswrapper[4733]: sleep 60 & wait Mar 18 10:14:13 crc kubenswrapper[4733]: continue Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: # Append resolver entries for services Mar 18 10:14:13 crc kubenswrapper[4733]: rc=0 Mar 18 10:14:13 crc kubenswrapper[4733]: for svc in "${!svc_ips[@]}"; do Mar 18 10:14:13 crc kubenswrapper[4733]: for ip in ${svc_ips[${svc}]}; do Mar 18 10:14:13 crc kubenswrapper[4733]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ $rc -ne 0 ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: sleep 60 & wait Mar 18 10:14:13 crc kubenswrapper[4733]: continue Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 10:14:13 crc kubenswrapper[4733]: # Replace /etc/hosts with our modified version if needed Mar 18 10:14:13 crc kubenswrapper[4733]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 10:14:13 crc kubenswrapper[4733]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: sleep 60 & wait Mar 18 10:14:13 crc kubenswrapper[4733]: unset svc_ips Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-httph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-hsk58_openshift-dns(c2c181c8-3361-40a2-afc5-a677e0ab4ecd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.606321 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 18 10:14:13 crc kubenswrapper[4733]: set -euo pipefail Mar 18 10:14:13 crc kubenswrapper[4733]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 10:14:13 crc kubenswrapper[4733]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 10:14:13 crc kubenswrapper[4733]: # As the secret mount is optional we must wait for the files to be present. 
Mar 18 10:14:13 crc kubenswrapper[4733]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 18 10:14:13 crc kubenswrapper[4733]: TS=$(date +%s) Mar 18 10:14:13 crc kubenswrapper[4733]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 10:14:13 crc kubenswrapper[4733]: HAS_LOGGED_INFO=0 Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: log_missing_certs(){ Mar 18 10:14:13 crc kubenswrapper[4733]: CUR_TS=$(date +%s) Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 10:14:13 crc kubenswrapper[4733]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 10:14:13 crc kubenswrapper[4733]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 18 10:14:13 crc kubenswrapper[4733]: HAS_LOGGED_INFO=1 Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: } Mar 18 10:14:13 crc kubenswrapper[4733]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 18 10:14:13 crc kubenswrapper[4733]: log_missing_certs Mar 18 10:14:13 crc kubenswrapper[4733]: sleep 5 Mar 18 10:14:13 crc kubenswrapper[4733]: done Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 10:14:13 crc kubenswrapper[4733]: exec /usr/bin/kube-rbac-proxy \ Mar 18 10:14:13 crc kubenswrapper[4733]: --logtostderr \ Mar 18 10:14:13 crc kubenswrapper[4733]: --secure-listen-address=:9108 \ Mar 18 10:14:13 crc kubenswrapper[4733]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 10:14:13 crc kubenswrapper[4733]: --upstream=http://127.0.0.1:29108/ \ Mar 18 10:14:13 crc kubenswrapper[4733]: --tls-private-key-file=${TLS_PK} \ Mar 18 10:14:13 crc kubenswrapper[4733]: --tls-cert-file=${TLS_CERT} Mar 18 10:14:13 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vg7hc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-spfjj_openshift-ovn-kubernetes(7d693a73-68c1-4595-bbcc-be97691b06fe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.607622 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-hsk58" podUID="c2c181c8-3361-40a2-afc5-a677e0ab4ecd" Mar 18 10:14:13 crc kubenswrapper[4733]: W0318 10:14:13.614360 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f75e1c5_e0c5_43df_944f_77b734070793.slice/crio-44de9df63ec3ee4eae109c26660b6c1eb3f01d1f6242fd4f233e7ef6cfded6a1 WatchSource:0}: Error finding container 44de9df63ec3ee4eae109c26660b6c1eb3f01d1f6242fd4f233e7ef6cfded6a1: Status 404 returned error can't find the container with id 44de9df63ec3ee4eae109c26660b6c1eb3f01d1f6242fd4f233e7ef6cfded6a1 Mar 18 10:14:13 crc 
kubenswrapper[4733]: E0318 10:14:13.614915 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ -f "/env/_master" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: set -o allexport Mar 18 10:14:13 crc kubenswrapper[4733]: source "/env/_master" Mar 18 10:14:13 crc kubenswrapper[4733]: set +o allexport Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: ovn_v4_join_subnet_opt= Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ "" != "" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: ovn_v6_join_subnet_opt= Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ "" != "" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: ovn_v4_transit_switch_subnet_opt= Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ "" != "" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: ovn_v6_transit_switch_subnet_opt= Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ "" != "" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: dns_name_resolver_enabled_flag= Mar 18 10:14:13 crc 
kubenswrapper[4733]: if [[ "false" == "true" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: persistent_ips_enabled_flag= Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ "true" == "true" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: # This is needed so that converting clusters from GA to TP Mar 18 10:14:13 crc kubenswrapper[4733]: # will rollout control plane pods as well Mar 18 10:14:13 crc kubenswrapper[4733]: network_segmentation_enabled_flag= Mar 18 10:14:13 crc kubenswrapper[4733]: multi_network_enabled_flag= Mar 18 10:14:13 crc kubenswrapper[4733]: if [[ "true" == "true" ]]; then Mar 18 10:14:13 crc kubenswrapper[4733]: multi_network_enabled_flag="--enable-multi-network" Mar 18 10:14:13 crc kubenswrapper[4733]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 18 10:14:13 crc kubenswrapper[4733]: fi Mar 18 10:14:13 crc kubenswrapper[4733]: Mar 18 10:14:13 crc kubenswrapper[4733]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 18 10:14:13 crc kubenswrapper[4733]: exec /usr/bin/ovnkube \ Mar 18 10:14:13 crc kubenswrapper[4733]: --enable-interconnect \ Mar 18 10:14:13 crc kubenswrapper[4733]: --init-cluster-manager "${K8S_NODE}" \ Mar 18 10:14:13 crc kubenswrapper[4733]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 18 10:14:13 crc kubenswrapper[4733]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 18 10:14:13 crc kubenswrapper[4733]: --metrics-bind-address "127.0.0.1:29108" \ Mar 18 10:14:13 crc kubenswrapper[4733]: --metrics-enable-pprof \ Mar 18 10:14:13 crc kubenswrapper[4733]: 
--metrics-enable-config-duration \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${ovn_v4_join_subnet_opt} \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${ovn_v6_join_subnet_opt} \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${dns_name_resolver_enabled_flag} \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${persistent_ips_enabled_flag} \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${multi_network_enabled_flag} \ Mar 18 10:14:13 crc kubenswrapper[4733]: ${network_segmentation_enabled_flag} Mar 18 10:14:13 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vg7hc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-spfjj_openshift-ovn-kubernetes(7d693a73-68c1-4595-bbcc-be97691b06fe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.614999 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.615472 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: W0318 10:14:13.616110 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc85b0d4_15a5_4894_9f07_9aaeb28f63fa.slice/crio-f6b7bfe1b02ca90d8b984d1a9c50c920b6a4ec109f2b15d0d9e4be17d1064a45 WatchSource:0}: Error finding container f6b7bfe1b02ca90d8b984d1a9c50c920b6a4ec109f2b15d0d9e4be17d1064a45: Status 404 returned error can't find the container with id f6b7bfe1b02ca90d8b984d1a9c50c920b6a4ec109f2b15d0d9e4be17d1064a45 Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.616120 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.616174 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" podUID="7d693a73-68c1-4595-bbcc-be97691b06fe" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.617925 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xpnv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.617954 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:13 crc kubenswrapper[4733]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 10:14:13 crc kubenswrapper[4733]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 10:14:13 crc kubenswrapper[4733]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ph8vv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-g6j2q_openshift-multus(cc85b0d4-15a5-4894-9f07-9aaeb28f63fa): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:13 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.619219 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-g6j2q" podUID="cc85b0d4-15a5-4894-9f07-9aaeb28f63fa" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.620346 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xpnv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.621423 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.625144 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.638335 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.651949 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.651986 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.651999 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.652016 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.652028 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.655370 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.670942 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.680144 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.695476 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.707024 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.723159 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.733614 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.748215 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.755272 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.755332 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.755353 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.755382 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.755401 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.759860 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.772323 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.787486 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.801046 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.812701 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.824354 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.835943 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.845075 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.856709 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.861819 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.861856 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.861868 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.861888 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.861902 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.866088 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.883736 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.883932 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 10:14:14.883902848 +0000 UTC m=+94.375637173 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.883986 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.884056 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.884120 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.884171 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884172 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884251 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884300 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:14.884291989 +0000 UTC m=+94.376026314 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884326 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:14.884316359 +0000 UTC m=+94.376050684 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884360 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884387 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884406 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884468 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:14.884448983 +0000 UTC m=+94.376183318 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884609 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.884656 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:14.884644489 +0000 UTC m=+94.376378814 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.887026 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.927571 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.964876 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.964935 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.964946 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.964963 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.964973 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:13Z","lastTransitionTime":"2026-03-18T10:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.968646 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:13 crc kubenswrapper[4733]: I0318 10:14:13.985675 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.985868 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.985891 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.985905 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:13 crc kubenswrapper[4733]: E0318 10:14:13.985958 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:14.985941686 +0000 UTC m=+94.477676011 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.010264 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.050080 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.067498 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.067542 4733 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.067555 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.067572 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.067586 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.106434 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.170153 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.170218 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.170231 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.170248 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.170261 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.272724 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.273069 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.273158 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.273264 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.273369 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.375579 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.375616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.375624 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.375647 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.375657 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.477967 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.478018 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.478030 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.478049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.478066 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.578281 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"44de9df63ec3ee4eae109c26660b6c1eb3f01d1f6242fd4f233e7ef6cfded6a1"} Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.582286 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xpnv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.582420 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.582437 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.582447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.582464 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.582486 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" 
event={"ID":"7d693a73-68c1-4595-bbcc-be97691b06fe","Type":"ContainerStarted","Data":"ca6147b370c26e7fb7577a789dd3313109801f85318d443e4aaac91a1c487a78"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.582478 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.591907 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xpnv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.592175 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:14 crc kubenswrapper[4733]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 18 10:14:14 crc kubenswrapper[4733]: set -euo pipefail Mar 18 10:14:14 crc kubenswrapper[4733]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 10:14:14 crc kubenswrapper[4733]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 10:14:14 crc kubenswrapper[4733]: # As the secret mount is optional we must wait for the files to be present. Mar 18 10:14:14 crc kubenswrapper[4733]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 18 10:14:14 crc kubenswrapper[4733]: TS=$(date +%s) Mar 18 10:14:14 crc kubenswrapper[4733]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 10:14:14 crc kubenswrapper[4733]: HAS_LOGGED_INFO=0 Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: log_missing_certs(){ Mar 18 10:14:14 crc kubenswrapper[4733]: CUR_TS=$(date +%s) Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 10:14:14 crc kubenswrapper[4733]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 10:14:14 crc kubenswrapper[4733]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 18 10:14:14 crc kubenswrapper[4733]: HAS_LOGGED_INFO=1 Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: } Mar 18 10:14:14 crc kubenswrapper[4733]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 18 10:14:14 crc kubenswrapper[4733]: log_missing_certs Mar 18 10:14:14 crc kubenswrapper[4733]: sleep 5 Mar 18 10:14:14 crc kubenswrapper[4733]: done Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 10:14:14 crc kubenswrapper[4733]: exec /usr/bin/kube-rbac-proxy \ Mar 18 10:14:14 crc kubenswrapper[4733]: --logtostderr \ Mar 18 10:14:14 crc kubenswrapper[4733]: --secure-listen-address=:9108 \ Mar 18 10:14:14 crc kubenswrapper[4733]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 10:14:14 crc kubenswrapper[4733]: --upstream=http://127.0.0.1:29108/ \ Mar 18 10:14:14 crc kubenswrapper[4733]: --tls-private-key-file=${TLS_PK} \ Mar 18 10:14:14 crc kubenswrapper[4733]: --tls-cert-file=${TLS_CERT} Mar 18 10:14:14 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vg7hc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-spfjj_openshift-ovn-kubernetes(7d693a73-68c1-4595-bbcc-be97691b06fe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:14 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.593206 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hsk58" event={"ID":"c2c181c8-3361-40a2-afc5-a677e0ab4ecd","Type":"ContainerStarted","Data":"16909e99ed8cac62104f4c439d3da9d1b3ffb5e99fada86f18c770cdbb02a00a"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.593250 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.593754 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.597722 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:14 crc kubenswrapper[4733]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ -f "/env/_master" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: set -o allexport Mar 18 10:14:14 crc kubenswrapper[4733]: source "/env/_master" Mar 18 10:14:14 crc 
kubenswrapper[4733]: set +o allexport Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: ovn_v4_join_subnet_opt= Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "" != "" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: ovn_v6_join_subnet_opt= Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "" != "" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: ovn_v4_transit_switch_subnet_opt= Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "" != "" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: ovn_v6_transit_switch_subnet_opt= Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "" != "" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: dns_name_resolver_enabled_flag= Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "false" == "true" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: persistent_ips_enabled_flag= Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "true" == "true" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc 
kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: # This is needed so that converting clusters from GA to TP Mar 18 10:14:14 crc kubenswrapper[4733]: # will rollout control plane pods as well Mar 18 10:14:14 crc kubenswrapper[4733]: network_segmentation_enabled_flag= Mar 18 10:14:14 crc kubenswrapper[4733]: multi_network_enabled_flag= Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "true" == "true" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: multi_network_enabled_flag="--enable-multi-network" Mar 18 10:14:14 crc kubenswrapper[4733]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 18 10:14:14 crc kubenswrapper[4733]: exec /usr/bin/ovnkube \ Mar 18 10:14:14 crc kubenswrapper[4733]: --enable-interconnect \ Mar 18 10:14:14 crc kubenswrapper[4733]: --init-cluster-manager "${K8S_NODE}" \ Mar 18 10:14:14 crc kubenswrapper[4733]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 18 10:14:14 crc kubenswrapper[4733]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 18 10:14:14 crc kubenswrapper[4733]: --metrics-bind-address "127.0.0.1:29108" \ Mar 18 10:14:14 crc kubenswrapper[4733]: --metrics-enable-pprof \ Mar 18 10:14:14 crc kubenswrapper[4733]: --metrics-enable-config-duration \ Mar 18 10:14:14 crc kubenswrapper[4733]: ${ovn_v4_join_subnet_opt} \ Mar 18 10:14:14 crc kubenswrapper[4733]: ${ovn_v6_join_subnet_opt} \ Mar 18 10:14:14 crc kubenswrapper[4733]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 18 10:14:14 crc kubenswrapper[4733]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 18 10:14:14 crc kubenswrapper[4733]: ${dns_name_resolver_enabled_flag} \ Mar 18 10:14:14 crc kubenswrapper[4733]: ${persistent_ips_enabled_flag} \ Mar 18 10:14:14 crc kubenswrapper[4733]: ${multi_network_enabled_flag} \ Mar 18 10:14:14 
crc kubenswrapper[4733]: ${network_segmentation_enabled_flag} Mar 18 10:14:14 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vg7hc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-spfjj_openshift-ovn-kubernetes(7d693a73-68c1-4595-bbcc-be97691b06fe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:14 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 
10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.597956 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:14 crc kubenswrapper[4733]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 10:14:14 crc kubenswrapper[4733]: set -uo pipefail Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 10:14:14 crc kubenswrapper[4733]: HOSTS_FILE="/etc/hosts" Mar 18 10:14:14 crc kubenswrapper[4733]: TEMP_FILE="/etc/hosts.tmp" Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: # Make a temporary file with the old hosts file's attributes. Mar 18 10:14:14 crc kubenswrapper[4733]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 10:14:14 crc kubenswrapper[4733]: echo "Failed to preserve hosts file. Exiting." Mar 18 10:14:14 crc kubenswrapper[4733]: exit 1 Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: while true; do Mar 18 10:14:14 crc kubenswrapper[4733]: declare -A svc_ips Mar 18 10:14:14 crc kubenswrapper[4733]: for svc in "${services[@]}"; do Mar 18 10:14:14 crc kubenswrapper[4733]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 10:14:14 crc kubenswrapper[4733]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 18 10:14:14 crc kubenswrapper[4733]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 10:14:14 crc kubenswrapper[4733]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 10:14:14 crc kubenswrapper[4733]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 10:14:14 crc kubenswrapper[4733]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 10:14:14 crc kubenswrapper[4733]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 10:14:14 crc kubenswrapper[4733]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 10:14:14 crc kubenswrapper[4733]: for i in ${!cmds[*]} Mar 18 10:14:14 crc kubenswrapper[4733]: do Mar 18 10:14:14 crc kubenswrapper[4733]: ips=($(eval "${cmds[i]}")) Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: svc_ips["${svc}"]="${ips[@]}" Mar 18 10:14:14 crc kubenswrapper[4733]: break Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: done Mar 18 10:14:14 crc kubenswrapper[4733]: done Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: # Update /etc/hosts only if we get valid service IPs Mar 18 10:14:14 crc kubenswrapper[4733]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 10:14:14 crc kubenswrapper[4733]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 10:14:14 crc kubenswrapper[4733]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 10:14:14 crc kubenswrapper[4733]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 10:14:14 crc kubenswrapper[4733]: sleep 60 & wait Mar 18 10:14:14 crc kubenswrapper[4733]: continue Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: # Append resolver entries for services Mar 18 10:14:14 crc kubenswrapper[4733]: rc=0 Mar 18 10:14:14 crc kubenswrapper[4733]: for svc in "${!svc_ips[@]}"; do Mar 18 10:14:14 crc kubenswrapper[4733]: for ip in ${svc_ips[${svc}]}; do Mar 18 10:14:14 crc kubenswrapper[4733]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 18 10:14:14 crc kubenswrapper[4733]: done Mar 18 10:14:14 crc kubenswrapper[4733]: done Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ $rc -ne 0 ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: sleep 60 & wait Mar 18 10:14:14 crc kubenswrapper[4733]: continue Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 10:14:14 crc kubenswrapper[4733]: # Replace /etc/hosts with our modified version if needed Mar 18 10:14:14 crc kubenswrapper[4733]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 10:14:14 crc kubenswrapper[4733]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: sleep 60 & wait Mar 18 10:14:14 crc kubenswrapper[4733]: unset svc_ips Mar 18 10:14:14 crc kubenswrapper[4733]: done Mar 18 10:14:14 crc kubenswrapper[4733]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-httph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-hsk58_openshift-dns(c2c181c8-3361-40a2-afc5-a677e0ab4ecd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:14 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.598890 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with 
CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" podUID="7d693a73-68c1-4595-bbcc-be97691b06fe" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.599111 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-hsk58" podUID="c2c181c8-3361-40a2-afc5-a677e0ab4ecd" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.599958 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerStarted","Data":"ff1d93cef35a316c51b4c062176258993d12b241a2a73aaf584a089109ebf4c1"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.606120 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"082c29e3548c80a38ab2a902a222a7bd38f378ff323b000dc303bf38651a6b0f"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.607400 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.607619 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwk4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifec
ycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-t28sh_openshift-multus(0f82588a-9dbd-4c55-8cfc-f96e57fa58b9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.608910 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-t28sh" podUID="0f82588a-9dbd-4c55-8cfc-f96e57fa58b9" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.610970 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.612125 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7fb0376f2eec0926827408777c433a08963622c37f222da5c978e502f9dbcbfd"} Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.612816 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.614687 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:14 crc kubenswrapper[4733]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ -f "/env/_master" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: set -o allexport Mar 18 10:14:14 crc kubenswrapper[4733]: source "/env/_master" Mar 18 10:14:14 crc kubenswrapper[4733]: set +o allexport Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 10:14:14 crc kubenswrapper[4733]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 10:14:14 crc kubenswrapper[4733]: ho_enable="--enable-hybrid-overlay" Mar 18 10:14:14 crc kubenswrapper[4733]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 10:14:14 crc kubenswrapper[4733]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 10:14:14 crc kubenswrapper[4733]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 10:14:14 crc kubenswrapper[4733]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 10:14:14 crc kubenswrapper[4733]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 10:14:14 crc kubenswrapper[4733]: --webhook-host=127.0.0.1 \ Mar 18 10:14:14 crc kubenswrapper[4733]: --webhook-port=9743 \ Mar 18 10:14:14 crc kubenswrapper[4733]: ${ho_enable} \ Mar 18 10:14:14 crc kubenswrapper[4733]: --enable-interconnect \ Mar 18 10:14:14 crc kubenswrapper[4733]: 
--disable-approver \ Mar 18 10:14:14 crc kubenswrapper[4733]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 10:14:14 crc kubenswrapper[4733]: --wait-for-kubernetes-api=200s \ Mar 18 10:14:14 crc kubenswrapper[4733]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 10:14:14 crc kubenswrapper[4733]: --loglevel="${LOGLEVEL}" Mar 18 10:14:14 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:n
il,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:14 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.614839 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6j2q" event={"ID":"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa","Type":"ContainerStarted","Data":"f6b7bfe1b02ca90d8b984d1a9c50c920b6a4ec109f2b15d0d9e4be17d1064a45"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.616593 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.616985 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:14 crc kubenswrapper[4733]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 10:14:14 crc kubenswrapper[4733]: apiVersion: v1 Mar 18 10:14:14 crc kubenswrapper[4733]: clusters: Mar 18 10:14:14 crc kubenswrapper[4733]: - cluster: Mar 18 10:14:14 crc kubenswrapper[4733]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 10:14:14 crc kubenswrapper[4733]: server: https://api-int.crc.testing:6443 Mar 18 10:14:14 crc kubenswrapper[4733]: name: default-cluster Mar 18 10:14:14 crc kubenswrapper[4733]: contexts: Mar 18 10:14:14 crc kubenswrapper[4733]: - context: Mar 18 10:14:14 crc kubenswrapper[4733]: cluster: default-cluster Mar 18 10:14:14 crc kubenswrapper[4733]: namespace: default Mar 18 10:14:14 crc kubenswrapper[4733]: user: default-auth Mar 18 10:14:14 crc 
kubenswrapper[4733]: name: default-context Mar 18 10:14:14 crc kubenswrapper[4733]: current-context: default-context Mar 18 10:14:14 crc kubenswrapper[4733]: kind: Config Mar 18 10:14:14 crc kubenswrapper[4733]: preferences: {} Mar 18 10:14:14 crc kubenswrapper[4733]: users: Mar 18 10:14:14 crc kubenswrapper[4733]: - name: default-auth Mar 18 10:14:14 crc kubenswrapper[4733]: user: Mar 18 10:14:14 crc kubenswrapper[4733]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 10:14:14 crc kubenswrapper[4733]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 10:14:14 crc kubenswrapper[4733]: EOF Mar 18 10:14:14 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqxdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:14 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.618124 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:14 crc kubenswrapper[4733]: 
container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 10:14:14 crc kubenswrapper[4733]: if [[ -f "/env/_master" ]]; then Mar 18 10:14:14 crc kubenswrapper[4733]: set -o allexport Mar 18 10:14:14 crc kubenswrapper[4733]: source "/env/_master" Mar 18 10:14:14 crc kubenswrapper[4733]: set +o allexport Mar 18 10:14:14 crc kubenswrapper[4733]: fi Mar 18 10:14:14 crc kubenswrapper[4733]: Mar 18 10:14:14 crc kubenswrapper[4733]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 10:14:14 crc kubenswrapper[4733]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 10:14:14 crc kubenswrapper[4733]: --disable-webhook \ Mar 18 10:14:14 crc kubenswrapper[4733]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 10:14:14 crc kubenswrapper[4733]: --loglevel="${LOGLEVEL}" Mar 18 10:14:14 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:14 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.618168 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.618362 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 10:14:14 crc 
kubenswrapper[4733]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 10:14:14 crc kubenswrapper[4733]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 10:14:14 crc kubenswrapper[4733]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ph8vv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-g6j2q_openshift-multus(cc85b0d4-15a5-4894-9f07-9aaeb28f63fa): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 10:14:14 crc kubenswrapper[4733]: > logger="UnhandledError" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.619377 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.619547 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-g6j2q" podUID="cc85b0d4-15a5-4894-9f07-9aaeb28f63fa" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.624654 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.636686 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.645595 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.655118 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.664865 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.676893 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.685546 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.685601 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.685611 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.685629 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.685642 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.686037 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.692960 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.708601 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.724396 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.733668 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.740735 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.750370 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.767564 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.807559 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.819330 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.819394 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.819404 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.819423 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.819436 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.849295 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.887457 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.918173 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:14 crc 
kubenswrapper[4733]: I0318 10:14:14.918380 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.918423 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.918445 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918503 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:14:16.918483803 +0000 UTC m=+96.410218128 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.918555 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918619 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918679 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918713 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:16.918705499 +0000 UTC m=+96.410439825 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918742 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:16.91871922 +0000 UTC m=+96.410453545 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918629 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918774 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918786 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918812 4733 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:16.918805922 +0000 UTC m=+96.410540247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.918834 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:14 crc kubenswrapper[4733]: E0318 10:14:14.919030 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:16.918973547 +0000 UTC m=+96.410708062 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.923105 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.923160 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.923183 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.923253 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.923278 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:14Z","lastTransitionTime":"2026-03-18T10:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.934220 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:14 crc kubenswrapper[4733]: I0318 10:14:14.967108 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.006555 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.019772 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:15 crc kubenswrapper[4733]: E0318 10:14:15.020148 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:15 crc kubenswrapper[4733]: E0318 10:14:15.020254 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:15 crc kubenswrapper[4733]: E0318 10:14:15.020288 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:15 crc kubenswrapper[4733]: E0318 10:14:15.020409 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:17.020374287 +0000 UTC m=+96.512108772 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.026660 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.026718 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.026735 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.026760 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.026778 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.050959 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.088463 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.129583 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.130234 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.130322 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 
10:14:15.130353 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.130390 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.130417 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.172948 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.175437 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:15 crc kubenswrapper[4733]: E0318 10:14:15.175583 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.175721 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.175790 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:15 crc kubenswrapper[4733]: E0318 10:14:15.175890 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:15 crc kubenswrapper[4733]: E0318 10:14:15.175980 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.176004 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:15 crc kubenswrapper[4733]: E0318 10:14:15.176329 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.181601 4733 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.184512 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.186138 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.188778 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.190122 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.192336 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.193736 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.194628 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.195424 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.196353 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.197239 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.198026 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.199274 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.200264 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.201321 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.202075 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.205894 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.206929 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.208050 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.208879 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.209744 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.211360 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.212340 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.213744 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.214753 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.215532 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.217098 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.218692 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.219480 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.220436 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.221794 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.222581 4733 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.222795 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.225540 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.226076 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.226803 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.228933 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.229604 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.230780 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.231462 4733 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.232864 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.233415 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.234176 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.234321 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.234381 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.234404 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.234442 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.234469 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.235054 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.235283 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.236319 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.236882 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.237965 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 10:14:15 crc 
kubenswrapper[4733]: I0318 10:14:15.238580 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.239702 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.240319 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.241231 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.241863 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.242433 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.243423 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.243921 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 10:14:15 crc 
kubenswrapper[4733]: I0318 10:14:15.267707 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.317298 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.338450 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.338610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.338678 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.338792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.338902 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.442633 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.442707 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.442726 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.442755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.442775 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.545932 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.545998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.546020 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.546052 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.546077 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.648947 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.649042 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.649057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.649077 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.649093 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.755433 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.755468 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.755479 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.755496 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.755510 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.858007 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.858055 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.858067 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.858083 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.858094 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.961143 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.961211 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.961220 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.961241 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:15 crc kubenswrapper[4733]: I0318 10:14:15.961253 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:15Z","lastTransitionTime":"2026-03-18T10:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.064311 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.064383 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.064405 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.064435 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.064489 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.166792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.166875 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.166896 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.166927 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.166950 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.269844 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.269897 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.269906 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.269944 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.269956 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.372912 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.372960 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.372969 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.372985 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.372996 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.476134 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.476169 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.476177 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.476207 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.476217 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.578306 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.578337 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.578345 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.578360 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.578370 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.681815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.681880 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.681893 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.681916 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.681928 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.785863 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.785909 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.785926 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.785947 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.785964 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.889379 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.889448 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.889470 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.889498 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.889519 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.942161 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.942383 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.942423 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:14:20.942381251 +0000 UTC m=+100.434115616 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.942462 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.942508 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.942568 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.942691 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:16 crc 
kubenswrapper[4733]: E0318 10:14:16.942703 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.942755 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:20.942739751 +0000 UTC m=+100.434474116 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.942752 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.942756 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.942906 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.942713 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.942849 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:20.942819763 +0000 UTC m=+100.434554128 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.943087 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:20.94305852 +0000 UTC m=+100.434792875 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:16 crc kubenswrapper[4733]: E0318 10:14:16.943116 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:20.943102061 +0000 UTC m=+100.434836416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.993657 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.993732 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.993755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.993784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:16 crc kubenswrapper[4733]: I0318 10:14:16.993807 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:16Z","lastTransitionTime":"2026-03-18T10:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.043777 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:17 crc kubenswrapper[4733]: E0318 10:14:17.043993 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:17 crc kubenswrapper[4733]: E0318 10:14:17.044027 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:17 crc kubenswrapper[4733]: E0318 10:14:17.044040 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:17 crc kubenswrapper[4733]: E0318 10:14:17.044106 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:21.04408664 +0000 UTC m=+100.535820965 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.096372 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.096432 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.096450 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.096508 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.096528 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.174720 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.174816 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:17 crc kubenswrapper[4733]: E0318 10:14:17.174883 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:17 crc kubenswrapper[4733]: E0318 10:14:17.175010 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.175105 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:17 crc kubenswrapper[4733]: E0318 10:14:17.176363 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.176549 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:17 crc kubenswrapper[4733]: E0318 10:14:17.176916 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.200112 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.200333 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.200394 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.200421 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.200440 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.303124 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.303213 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.303233 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.303264 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.303289 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.406724 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.406776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.406792 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.406811 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.406825 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.510143 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.510210 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.510227 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.510249 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.510266 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.613378 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.613434 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.613448 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.613478 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.613495 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.715984 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.716022 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.716034 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.716052 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.716065 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.819095 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.819172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.819242 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.819273 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.819293 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.922248 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.922317 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.922343 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.922373 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:17 crc kubenswrapper[4733]: I0318 10:14:17.922402 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:17Z","lastTransitionTime":"2026-03-18T10:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.025999 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.026063 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.026081 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.026109 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.026130 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.130608 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.130708 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.130737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.130782 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.130825 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.234687 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.234746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.234756 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.234773 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.234783 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.338257 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.338362 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.338386 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.338413 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.338431 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.441006 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.441084 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.441108 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.441140 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.441163 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.544317 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.544398 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.544422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.544453 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.544475 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.647717 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.647769 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.647786 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.647808 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.647825 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.750702 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.750750 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.750762 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.750781 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.750797 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.853609 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.853647 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.853655 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.853669 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.853681 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.956593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.956766 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.956857 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.956887 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:18 crc kubenswrapper[4733]: I0318 10:14:18.956960 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:18Z","lastTransitionTime":"2026-03-18T10:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.059941 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.059996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.060008 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.060027 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.060041 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.163588 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.163737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.163754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.163780 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.163798 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.175476 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.175568 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.175612 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.175851 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:19 crc kubenswrapper[4733]: E0318 10:14:19.175873 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:19 crc kubenswrapper[4733]: E0318 10:14:19.176003 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:19 crc kubenswrapper[4733]: E0318 10:14:19.176137 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:19 crc kubenswrapper[4733]: E0318 10:14:19.176300 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.267227 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.267274 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.267284 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.267302 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.267313 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.370642 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.370726 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.370746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.370777 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.370798 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.474401 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.474491 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.474512 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.474545 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.474566 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.579305 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.579383 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.579411 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.579443 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.579461 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.683409 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.683495 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.683516 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.683549 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.683571 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.786810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.786902 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.786921 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.786951 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.786972 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.890751 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.890869 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.890891 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.890919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.890938 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.994569 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.994654 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.994680 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.994716 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:19 crc kubenswrapper[4733]: I0318 10:14:19.994741 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:19Z","lastTransitionTime":"2026-03-18T10:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.098403 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.098478 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.098496 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.098525 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.098544 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.203246 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.203322 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.203350 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.203383 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.203407 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.307093 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.307249 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.307277 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.307309 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.307332 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.418367 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.418888 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.419106 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.420113 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.420253 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.525822 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.525871 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.525893 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.525923 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.525942 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.630276 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.630418 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.630438 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.630468 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.630486 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.734010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.734087 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.734113 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.734145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.734168 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.838032 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.838109 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.838130 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.838160 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.838183 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.941559 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.941650 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.941677 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.941713 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.941741 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:20Z","lastTransitionTime":"2026-03-18T10:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.993121 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.993401 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 10:14:28.993357173 +0000 UTC m=+108.485091538 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.993471 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.993534 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.993619 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:20 crc kubenswrapper[4733]: I0318 10:14:20.993679 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.993706 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.993792 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:28.993767385 +0000 UTC m=+108.485501740 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.993814 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.993909 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:28.993878618 +0000 UTC m=+108.485612983 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.993933 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.994031 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:28.994000021 +0000 UTC m=+108.485734386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.994026 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.994090 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.994119 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:20 crc kubenswrapper[4733]: E0318 10:14:20.994180 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:28.994163716 +0000 UTC m=+108.485898071 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.044803 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.044901 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.044931 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.044969 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.044994 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.094951 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.095245 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.095290 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.095306 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.095379 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:29.09535735 +0000 UTC m=+108.587091685 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.148229 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.148305 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.148323 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.148358 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.148382 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.174526 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.174556 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.174726 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.174904 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.174973 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.175220 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.175370 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.175530 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.190690 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.200535 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.210268 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.222404 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.237099 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.249892 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.252326 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.252403 4733 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.252425 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.252455 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.252481 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.275635 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.295664 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.311983 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.324261 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.340482 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.357573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.357755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.357747 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.357793 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.358023 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.358044 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.370613 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.387606 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.399762 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.461643 4733 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.461696 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.461707 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.461721 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.461731 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.527914 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.527982 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.528004 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.528029 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.528048 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.541056 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.547359 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.547424 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.547442 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.547473 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.547495 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.599008 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.604599 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.604671 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.604692 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.604768 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.604825 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.621307 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.626788 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.626874 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.626894 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.626948 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.626965 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.643320 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.649950 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.650024 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.650042 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.650072 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.650091 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.665972 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:21 crc kubenswrapper[4733]: E0318 10:14:21.666144 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.668685 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.668805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.668896 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.668970 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.669000 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.773101 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.773177 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.773226 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.773248 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.773266 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.877000 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.877071 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.877089 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.877119 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.877139 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.981693 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.981762 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.981779 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.981805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.981822 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:21Z","lastTransitionTime":"2026-03-18T10:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:21 crc kubenswrapper[4733]: I0318 10:14:21.982230 4733 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.086448 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.086545 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.086566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.086599 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.086624 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.192837 4733 scope.go:117] "RemoveContainer" containerID="ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.193627 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.193700 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.193738 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.193770 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.193796 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.196798 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.197021 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.298819 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.298859 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.298872 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.298891 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.298905 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.402249 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.402685 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.402695 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.402709 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.402726 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.505917 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.506053 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.506081 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.506145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.506173 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.610094 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.610134 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.610149 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.610170 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.610181 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.641157 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.643800 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.660405 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.674379 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.701381 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.712769 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.712820 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.712834 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.712852 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.712872 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.714322 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\
\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.726105 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.740877 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.753606 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.765238 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.795512 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.809822 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.815458 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.815523 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.815551 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.815588 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.815618 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.821905 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.836297 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.848640 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.862610 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.875944 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.892327 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.905669 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.918234 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.918327 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.918396 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.918464 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:22 crc kubenswrapper[4733]: I0318 10:14:22.918523 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:22Z","lastTransitionTime":"2026-03-18T10:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.021543 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.022386 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.022503 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.022593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.022684 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.126231 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.126580 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.126868 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.126978 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.127101 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.175480 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.175505 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.175543 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.175479 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:23 crc kubenswrapper[4733]: E0318 10:14:23.175632 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:23 crc kubenswrapper[4733]: E0318 10:14:23.175714 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:23 crc kubenswrapper[4733]: E0318 10:14:23.175785 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:23 crc kubenswrapper[4733]: E0318 10:14:23.175862 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.230878 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.231271 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.231419 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.231560 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.231708 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.335431 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.335846 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.335933 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.336019 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.336096 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.439369 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.439429 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.439445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.439471 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.439486 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.542620 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.542666 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.542680 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.542704 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.542719 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.645954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.646402 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.646507 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.646607 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.646694 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.647038 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.750096 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.750610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.750838 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.751054 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.751267 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.855263 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.855317 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.855330 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.855350 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.855363 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.958767 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.958834 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.958852 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.958881 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:23 crc kubenswrapper[4733]: I0318 10:14:23.958901 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:23Z","lastTransitionTime":"2026-03-18T10:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.062826 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.062890 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.062909 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.062941 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.062973 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.167110 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.167465 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.167536 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.167607 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.167669 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.270492 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.270779 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.271013 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.271183 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.271298 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.375057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.375400 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.375493 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.375590 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.375665 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.478902 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.478958 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.478973 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.478996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.479018 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.582713 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.583013 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.583168 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.583312 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.583428 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.686227 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.686284 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.686300 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.686328 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.686342 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.790516 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.790611 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.790638 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.790673 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.790697 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.894109 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.894566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.895030 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.895427 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:24 crc kubenswrapper[4733]: I0318 10:14:24.895742 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:24Z","lastTransitionTime":"2026-03-18T10:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:24.999901 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.000740 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.000981 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.001183 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.001473 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.104999 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.106381 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.106436 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.106466 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.106485 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.175369 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.175429 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.175507 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:25 crc kubenswrapper[4733]: E0318 10:14:25.175504 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.175429 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:25 crc kubenswrapper[4733]: E0318 10:14:25.175697 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:25 crc kubenswrapper[4733]: E0318 10:14:25.175795 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:25 crc kubenswrapper[4733]: E0318 10:14:25.175885 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.209723 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.209785 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.209807 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.209833 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.209853 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.312646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.312716 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.312739 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.312787 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.312809 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.416060 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.416129 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.416149 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.416177 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.416227 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.519796 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.519868 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.519892 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.519919 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.519940 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.623046 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.623125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.623145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.623256 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.623279 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.726866 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.726915 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.726932 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.726957 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.726973 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.830297 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.830388 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.830400 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.830415 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.830425 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.933587 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.934079 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.934249 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.934414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:25 crc kubenswrapper[4733]: I0318 10:14:25.934557 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:25Z","lastTransitionTime":"2026-03-18T10:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.038128 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.038172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.038201 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.038221 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.038234 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.141412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.141484 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.141510 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.141576 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.141599 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.245939 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.246017 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.246039 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.246073 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.246096 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.349251 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.349480 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.349492 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.349511 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.349526 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.452791 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.453471 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.453545 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.453643 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.453673 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.557053 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.557101 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.557112 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.557132 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.557149 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.659120 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.659124 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6j2q" event={"ID":"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa","Type":"ContainerStarted","Data":"cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.659166 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.659182 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.659225 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.659240 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.662456 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xfvfl" event={"ID":"bb58b528-9013-4fab-9747-60bb6ff1bc1f","Type":"ContainerStarted","Data":"fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.675134 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.690931 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.702334 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.715168 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.725483 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.733709 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.761309 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.767699 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.767740 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.767752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.767773 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.767786 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.778228 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.790625 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.804822 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.826347 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.838667 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.870729 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.870808 4733 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.870826 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.870853 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.870871 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.870839 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.883229 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.895252 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.912435 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.931037 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.946506 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.959855 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.974720 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.975164 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.975217 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.975230 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.975252 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.975266 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:26Z","lastTransitionTime":"2026-03-18T10:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:26 crc kubenswrapper[4733]: I0318 10:14:26.992958 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:
14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.004596 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.014831 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.037085 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.066054 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\"
,\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615
e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.076673 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.078591 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.078637 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.078648 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.078670 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.078680 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.094754 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.106156 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.120555 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.132947 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.145748 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.156859 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.167453 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.175417 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.175471 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:27 crc kubenswrapper[4733]: E0318 10:14:27.175614 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.175682 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:27 crc kubenswrapper[4733]: E0318 10:14:27.175730 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:27 crc kubenswrapper[4733]: E0318 10:14:27.175946 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.176055 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:27 crc kubenswrapper[4733]: E0318 10:14:27.176226 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.181003 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.181714 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.182461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.182661 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.182798 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.181373 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.287602 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.287664 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.287676 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.287695 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.287712 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.390960 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.391016 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.391028 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.391048 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.391063 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.493682 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.493760 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.493832 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.493871 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.493895 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.598370 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.598447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.598467 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.598492 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.598513 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.701930 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.702004 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.702022 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.702047 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.702067 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.805351 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.805422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.805439 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.805464 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.805483 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.909394 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.909480 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.909510 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.909578 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:27 crc kubenswrapper[4733]: I0318 10:14:27.909603 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:27Z","lastTransitionTime":"2026-03-18T10:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.012737 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.012818 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.012843 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.012876 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.012897 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.116367 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.116451 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.116471 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.116502 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.116525 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.227037 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.227145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.227264 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.227288 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.227310 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.331682 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.331716 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.331725 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.331741 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.331753 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.434659 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.434702 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.434713 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.434727 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.434737 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.537938 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.537992 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.538036 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.538060 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.538073 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.641702 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.641766 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.641785 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.641812 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.641832 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.678232 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.678331 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.681291 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hsk58" event={"ID":"c2c181c8-3361-40a2-afc5-a677e0ab4ecd","Type":"ContainerStarted","Data":"a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.685447 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerStarted","Data":"f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.688327 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.688384 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.704603 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.721594 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.740511 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.746003 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.746125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.746152 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.746184 4733 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.746273 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.756132 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.776737 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.791374 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.803762 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.814592 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.830508 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.845913 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.848530 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.848562 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.848572 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.848592 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.848604 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.861120 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.872806 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.883047 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.895632 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.906299 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.929505 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:28Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.949116 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:28Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.951312 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.951360 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.951374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.951391 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.951402 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:28Z","lastTransitionTime":"2026-03-18T10:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.978928 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:28Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:28 crc kubenswrapper[4733]: I0318 10:14:28.993801 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:28Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.017805 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.038366 4733 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.055234 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.055430 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.055494 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.055572 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.055645 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.055885 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.070278 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.084993 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.089570 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.089763 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.089817 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.089876 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.089904 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090033 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090117 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:45.090093874 +0000 UTC m=+124.581828219 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090171 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090239 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090267 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090236 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090361 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:45.090323781 +0000 UTC m=+124.582058146 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090416 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:45.090390923 +0000 UTC m=+124.582125438 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090544 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.090744 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:45.090729672 +0000 UTC m=+124.582464227 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.091023 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:14:45.090987409 +0000 UTC m=+124.582721774 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.103715 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.122478 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc 
kubenswrapper[4733]: I0318 10:14:29.139693 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.151014 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.157756 4733 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.157885 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.157963 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.158047 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.158170 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.159761 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.170943 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.175375 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.175380 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.175432 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.175659 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.175792 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.176774 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.176835 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.176912 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.184224 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.190595 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.191475 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.191541 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.191568 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:29 crc kubenswrapper[4733]: E0318 10:14:29.191651 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:14:45.191621658 +0000 UTC m=+124.683356053 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.206623 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.220237 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.241524 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn
kube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.262635 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.262676 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.262688 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.262705 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.262716 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.365278 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.365336 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.365351 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.365388 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.365406 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.468081 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.468132 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.468149 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.468172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.468224 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.572226 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.572288 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.572314 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.572341 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.572362 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.675303 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.675815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.675834 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.675860 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.675879 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.694758 4733 generic.go:334] "Generic (PLEG): container finished" podID="0f82588a-9dbd-4c55-8cfc-f96e57fa58b9" containerID="f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642" exitCode=0 Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.694868 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerDied","Data":"f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.699715 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378" exitCode=0 Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.699772 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.720406 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.740398 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.757687 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc 
kubenswrapper[4733]: I0318 10:14:29.777898 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.779593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.779622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.779633 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.779652 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 
10:14:29.779663 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.794063 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.806623 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.825520 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.843961 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.859096 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.872564 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.882649 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.882693 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.882704 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.882720 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.882731 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.896487 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.933751 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resour
ces-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.952037 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.970180 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.986972 4733 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.987343 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.987354 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.987369 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.987379 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:29Z","lastTransitionTime":"2026-03-18T10:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:29 crc kubenswrapper[4733]: I0318 10:14:29.987225 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:29Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.008710 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.024053 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.039859 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.053325 4733 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.065960 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.078763 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.089683 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc 
kubenswrapper[4733]: I0318 10:14:30.089722 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.089734 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.089752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.089765 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.103046 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.114575 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.140497 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.166200 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.179711 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.192479 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.192680 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.192771 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.192859 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.192921 4733 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.196671 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.214622 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.239955 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.254985 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.276752 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.293816 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.295084 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.295129 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc 
kubenswrapper[4733]: I0318 10:14:30.295140 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.295155 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.295165 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.309006 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.328888 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.398911 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.398954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.398967 4733 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.398992 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.399004 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.502688 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.502726 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.502767 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.502805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.502819 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.605279 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.605307 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.605316 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.605331 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.605339 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.710463 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.710503 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.710547 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.710567 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.710580 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.712771 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.719713 4733 generic.go:334] "Generic (PLEG): container finished" podID="0f82588a-9dbd-4c55-8cfc-f96e57fa58b9" containerID="5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc" exitCode=0 Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.719796 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerDied","Data":"5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.727272 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.727327 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.727343 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.727353 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.727363 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.727373 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.730158 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.733048 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.734938 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" event={"ID":"7d693a73-68c1-4595-bbcc-be97691b06fe","Type":"ContainerStarted","Data":"e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.734970 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" event={"ID":"7d693a73-68c1-4595-bbcc-be97691b06fe","Type":"ContainerStarted","Data":"cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.740333 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.758430 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.773872 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.789932 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.802503 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.812715 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.812901 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.812999 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.813088 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.813169 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.819253 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.831937 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.849825 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.863764 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.878909 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc 
kubenswrapper[4733]: I0318 10:14:30.894913 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.912207 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.916730 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.916772 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.916784 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.916803 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.916815 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:30Z","lastTransitionTime":"2026-03-18T10:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.926558 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.938172 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.962921 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.981624 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:30 crc kubenswrapper[4733]: I0318 10:14:30.994444 4733 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:30Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.007216 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.020140 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.021110 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.021265 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.021353 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.021444 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.021532 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.035992 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z 
is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.054833 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.067561 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.098647 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.121928 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.124519 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.124593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.124612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.124641 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.124662 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.137343 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.154381 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.173431 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.174771 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.174821 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.174885 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.174914 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.174991 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.175138 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65"
Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.175351 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.175458 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.217374 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.227375 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.227412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.227422 4733 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.227439 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.227449 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.268077 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.289031 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.310150 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.325039 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc 
kubenswrapper[4733]: I0318 10:14:31.329719 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.329748 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.329757 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.329776 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.329785 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.337864 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.350352 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.359950 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.370421 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc 
kubenswrapper[4733]: I0318 10:14:31.383394 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.399095 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.412902 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.425485 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.432556 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.432595 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.432605 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.432624 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.432639 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.440957 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z 
is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.458625 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.471737 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.492658 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.521911 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.532880 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.535499 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.535539 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.535549 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.535566 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.535577 4733 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.550555 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.564870 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.582759 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.600719 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.639077 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.639116 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.639125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc 
kubenswrapper[4733]: I0318 10:14:31.639138 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.639147 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.744394 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.744445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.744460 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.744480 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.744495 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.745604 4733 generic.go:334] "Generic (PLEG): container finished" podID="0f82588a-9dbd-4c55-8cfc-f96e57fa58b9" containerID="50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637" exitCode=0 Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.745685 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerDied","Data":"50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.771327 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.786652 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.786688 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.786698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.786714 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.786725 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.790769 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.806270 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.811441 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc 
kubenswrapper[4733]: I0318 10:14:31.813525 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.813564 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.813583 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.813606 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.813626 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.837766 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.846869 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.848445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.848479 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc 
kubenswrapper[4733]: I0318 10:14:31.848489 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.848508 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.848523 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.871721 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.874521 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.879985 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.880012 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.880022 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.880039 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.880050 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.887877 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.895481 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.899267 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.899307 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.899323 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.899343 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.899366 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.909236 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.923043 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: E0318 10:14:31.923252 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.925725 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.925761 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.925775 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.925795 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.925810 4733 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:31Z","lastTransitionTime":"2026-03-18T10:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.929805 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":
\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.948254 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.962412 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:31 crc kubenswrapper[4733]: I0318 10:14:31.991371 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.007563 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.028327 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.029244 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.029291 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.029302 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.029326 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.029338 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.042401 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.063142 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.095500 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.132582 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.132669 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.132698 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.132736 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.132761 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.136522 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 
10:14:32.242071 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.242150 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.242171 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.242231 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.242251 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.345709 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.345803 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.345831 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.345866 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.345888 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.450625 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.450705 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.450726 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.450762 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.450781 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.554956 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.555865 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.555909 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.555940 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.555961 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.658908 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.658986 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.659053 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.659091 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.659114 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.754830 4733 generic.go:334] "Generic (PLEG): container finished" podID="0f82588a-9dbd-4c55-8cfc-f96e57fa58b9" containerID="2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256" exitCode=0 Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.754925 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerDied","Data":"2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.764232 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.764445 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.764573 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.764733 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.764860 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.777150 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.786689 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.808914 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.832290 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.854247 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3551233
5ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.868531 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.868605 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.868632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.868669 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.868699 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.888543 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.914217 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.933148 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.953660 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.971865 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.971920 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.971939 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.971968 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.971987 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:32Z","lastTransitionTime":"2026-03-18T10:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:32 crc kubenswrapper[4733]: I0318 10:14:32.979301 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.000454 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T10:14:32Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.035868 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.049109 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.072749 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.076622 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.076685 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.076703 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.076731 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.076751 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.088921 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc 
kubenswrapper[4733]: I0318 10:14:33.104466 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.121702 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.143223 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.175522 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:33 crc kubenswrapper[4733]: E0318 10:14:33.175727 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.176408 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.176447 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.176846 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:33 crc kubenswrapper[4733]: E0318 10:14:33.176857 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:33 crc kubenswrapper[4733]: E0318 10:14:33.176961 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:33 crc kubenswrapper[4733]: E0318 10:14:33.177121 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.179014 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.179033 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.179041 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.179053 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.179063 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.282508 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.282584 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.282608 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.282632 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.282649 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.385620 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.385667 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.385677 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.385699 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.385710 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.489149 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.489233 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.489244 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.489272 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.489283 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.592813 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.592910 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.592934 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.592969 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.593001 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.696374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.696421 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.696430 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.696448 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.696458 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.789067 4733 generic.go:334] "Generic (PLEG): container finished" podID="0f82588a-9dbd-4c55-8cfc-f96e57fa58b9" containerID="eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80" exitCode=0 Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.789149 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerDied","Data":"eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.802844 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.803582 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.803605 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.803635 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.803656 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.819018 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.838905 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.897857 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.923843 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.923885 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.923895 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.923917 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.923931 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:33Z","lastTransitionTime":"2026-03-18T10:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.930756 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.946316 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.966941 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.977009 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:33 crc kubenswrapper[4733]: I0318 10:14:33.989979 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.002413 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.017860 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.026117 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.026147 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.026160 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.026181 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.026234 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.031130 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.049681 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.061922 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc 
kubenswrapper[4733]: I0318 10:14:34.076719 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.088696 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.100873 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.113159 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c
3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.129005 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.129063 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.129082 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.129108 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.129129 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.232497 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.232542 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.232552 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.232568 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.232577 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.336092 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.336175 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.336261 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.336301 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.336328 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.439374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.439438 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.439450 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.439469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.439531 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.542755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.542813 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.542826 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.542880 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.542893 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.646595 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.646665 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.646687 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.646713 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.646731 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.749915 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.749966 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.749983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.750011 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.750031 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.800649 4733 generic.go:334] "Generic (PLEG): container finished" podID="0f82588a-9dbd-4c55-8cfc-f96e57fa58b9" containerID="9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9" exitCode=0 Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.800710 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerDied","Data":"9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.827301 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.846012 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc 
kubenswrapper[4733]: I0318 10:14:34.854837 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.854891 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.854905 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.854957 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.854971 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.867965 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.884583 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.900810 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.917719 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.931059 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c
3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.946555 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2
a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.960646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.960683 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.960692 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.960709 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.960720 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:34Z","lastTransitionTime":"2026-03-18T10:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.963247 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:34 crc kubenswrapper[4733]: I0318 10:14:34.976593 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:34Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.005237 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.022614 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.042661 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.061717 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.063782 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.063815 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.063827 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.063847 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.063862 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.083340 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.098518 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.120694 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.167335 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.167409 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.167421 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.167442 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.167454 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.175219 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.175248 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.175263 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:35 crc kubenswrapper[4733]: E0318 10:14:35.175344 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:35 crc kubenswrapper[4733]: E0318 10:14:35.175527 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.175589 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:35 crc kubenswrapper[4733]: E0318 10:14:35.176330 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:35 crc kubenswrapper[4733]: E0318 10:14:35.176704 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.194984 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.271026 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.271087 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.271109 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.271133 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.271151 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.373972 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.374012 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.374023 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.374038 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.374075 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.476967 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.477022 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.477036 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.477061 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.477076 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.581452 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.581997 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.582023 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.582058 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.582081 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.684526 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.684562 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.684576 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.684592 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.684605 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.788515 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.788616 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.788634 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.788658 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.788679 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.812137 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.812667 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.812727 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.821961 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" event={"ID":"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9","Type":"ContainerStarted","Data":"14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.835535 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.852649 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.863918 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.886338 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.892392 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.892460 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.892478 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.892505 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.892523 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.910315 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.931265 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.955548 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.974792 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.992442 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:35Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.995520 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.995589 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.995608 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.995633 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:35 crc kubenswrapper[4733]: I0318 10:14:35.995654 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:35Z","lastTransitionTime":"2026-03-18T10:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.012493 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc 
kubenswrapper[4733]: I0318 10:14:36.036839 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.067786 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.086737 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.098379 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.098451 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.098471 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.098499 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.098530 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.106419 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.129563 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.147764 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.167998 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.188234 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.203069 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.203146 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.203167 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.203223 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.203470 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.231617 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.257090 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116ab
f7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b71
5db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18
T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.277729 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.306674 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.306746 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.306767 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:36 crc 
kubenswrapper[4733]: I0318 10:14:36.306794 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.306815 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.312788 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.331901 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.357014 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.378608 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.405348 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.409719 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.409797 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.409819 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.409848 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.409868 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.431926 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.453773 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.465900 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.477170 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.494705 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.512149 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.512213 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.512226 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.512244 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.512258 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.512175 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.521054 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.531485 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.552815 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.570603 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.583995 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.613375 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.614319 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.614365 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.614381 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.614406 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.614422 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.649042 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.668619 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.689172 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.703434 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.716612 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.716659 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.716672 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.716691 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.716702 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.726806 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\
\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4
a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.737661 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.775810 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.797403 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.813680 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.818998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.819066 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.819079 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.819096 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.819106 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.825221 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.831222 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc 
kubenswrapper[4733]: I0318 10:14:36.847109 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.853973 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.863109 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.874951 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.888870 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.902009 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.915698 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b6
34ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.921032 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.921086 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.921103 4733 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.921125 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.921140 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:36Z","lastTransitionTime":"2026-03-18T10:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.928805 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3b
c46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.949326 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.963593 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:36 crc kubenswrapper[4733]: I0318 10:14:36.975105 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:36Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.010281 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.026276 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.026335 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.026348 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.026367 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.026381 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.036927 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.050981 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.064932 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.080734 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.091421 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: 
I0318 10:14:37.112945 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.125272 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.128759 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: 
I0318 10:14:37.128810 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.128819 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.128837 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.128850 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.138712 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.149178 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc 
kubenswrapper[4733]: I0318 10:14:37.162326 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.176499 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.176579 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.176526 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.176652 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:37 crc kubenswrapper[4733]: E0318 10:14:37.176643 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:37 crc kubenswrapper[4733]: E0318 10:14:37.176756 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:37 crc kubenswrapper[4733]: E0318 10:14:37.176864 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:37 crc kubenswrapper[4733]: E0318 10:14:37.176930 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.180971 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.194876 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.207239 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.221996 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c
3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.230911 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.230940 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.230948 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.230961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.230970 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.235154 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.334091 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc 
kubenswrapper[4733]: I0318 10:14:37.334142 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.334152 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.334166 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.334178 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.437093 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.437406 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.437472 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.437534 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.437598 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.540012 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.540049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.540060 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.540078 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.540086 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.642917 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.642958 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.642969 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.642983 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.642995 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.746137 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.746175 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.746203 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.746224 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.746235 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.830262 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/0.log" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.833511 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96" exitCode=1 Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.833584 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.834675 4733 scope.go:117] "RemoveContainer" containerID="f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.848524 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.848563 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.848576 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.848593 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.848608 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.858759 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.874514 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.900035 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.915007 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: 
I0318 10:14:37.950383 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.951897 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.951941 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.951956 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.951974 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.951986 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:37Z","lastTransitionTime":"2026-03-18T10:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.967586 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.983953 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:37 crc kubenswrapper[4733]: I0318 10:14:37.997780 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:37Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.020671 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.037549 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.054996 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.055046 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.055058 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.055077 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.055089 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.056262 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea
8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.070465 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.083928 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.100494 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.125658 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.139779 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.158300 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.158344 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.158354 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.158370 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 
10:14:38.158382 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.173045 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:37Z\\\",\\\"message\\\":\\\"ory.go:160\\\\nI0318 10:14:37.435628 6605 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.435688 6605 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.435917 6605 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.436100 6605 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.436433 6605 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 10:14:37.436884 6605 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 10:14:37.438358 6605 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 10:14:37.438401 6605 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 10:14:37.438470 6605 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 10:14:37.438508 6605 factory.go:656] Stopping watch factory\\\\nI0318 10:14:37.438541 6605 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e47
44d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.195637 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.261246 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.261346 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.261371 4733 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.261412 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.261438 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.364819 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.364877 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.364904 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.364931 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.364947 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.467704 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.467733 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.467741 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.467753 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.467762 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.570121 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.570153 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.570162 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.570224 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.570233 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.672377 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.672414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.672424 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.672440 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.672450 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.774207 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.774273 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.774286 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.774305 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.774320 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.839253 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/0.log" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.842507 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.842906 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.859160 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 
10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.871404 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-1
8T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.876090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.876154 4733 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.876172 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.876237 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.876271 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.883535 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.895511 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc 
kubenswrapper[4733]: I0318 10:14:38.911481 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.928870 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.943462 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.961151 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.975531 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c
3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.978078 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.978113 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.978122 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.978139 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.978149 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:38Z","lastTransitionTime":"2026-03-18T10:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.989342 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:38 crc kubenswrapper[4733]: I0318 10:14:38.998368 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:38Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.029452 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:37Z\\\",\\\"message\\\":\\\"ory.go:160\\\\nI0318 10:14:37.435628 6605 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.435688 6605 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.435917 6605 reflector.go:311] Stopping reflector 
*v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.436100 6605 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.436433 6605 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 10:14:37.436884 6605 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 10:14:37.438358 6605 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 10:14:37.438401 6605 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 10:14:37.438470 6605 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 10:14:37.438508 6605 factory.go:656] Stopping watch factory\\\\nI0318 10:14:37.438541 6605 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.061752 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.072390 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.080384 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.080427 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.080440 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.080461 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.080474 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.086336 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.098277 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.111251 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.121083 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: 
I0318 10:14:39.175336 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.175382 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.175409 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:39 crc kubenswrapper[4733]: E0318 10:14:39.175457 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.175469 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:39 crc kubenswrapper[4733]: E0318 10:14:39.175551 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:39 crc kubenswrapper[4733]: E0318 10:14:39.175605 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:39 crc kubenswrapper[4733]: E0318 10:14:39.175656 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.182293 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.182339 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.182353 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.182373 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.182387 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.285423 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.285505 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.285534 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.285567 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.285593 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.388702 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.388741 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.388755 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.388771 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.388785 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.492882 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.492954 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.493008 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.493034 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.493053 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.596422 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.596487 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.596506 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.596532 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.596550 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.699754 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.699805 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.699824 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.699845 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.699860 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.803731 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.803774 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.803786 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.803804 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.803816 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.848131 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/1.log" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.848958 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/0.log" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.852110 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2" exitCode=1 Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.852169 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.852263 4733 scope.go:117] "RemoveContainer" containerID="f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.853038 4733 scope.go:117] "RemoveContainer" containerID="e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2" Mar 18 10:14:39 crc kubenswrapper[4733]: E0318 10:14:39.853319 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.894477 4733 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84
087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1f
b619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.906830 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.906885 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.906902 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.906926 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.906943 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:39Z","lastTransitionTime":"2026-03-18T10:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.907494 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.922902 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.949584 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.976090 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:39 crc kubenswrapper[4733]: I0318 10:14:39.995730 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:39Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: 
I0318 10:14:40.009324 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.009363 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.009374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.009388 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.009399 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.020171 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.038304 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.057022 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.073951 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc 
kubenswrapper[4733]: I0318 10:14:40.094047 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.111904 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.112156 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.112180 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.112207 4733 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.112221 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.112232 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.126526 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.142405 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.160790 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.177830 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.194202 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.217508 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.217562 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.217572 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.217587 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.217598 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.221996 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f090e00a8f1c87ebbc1c282f9e7528c15f4755dce93436309480932f74815e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:37Z\\\",\\\"message\\\":\\\"ory.go:160\\\\nI0318 10:14:37.435628 6605 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.435688 6605 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.435917 6605 reflector.go:311] Stopping reflector 
*v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.436100 6605 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 10:14:37.436433 6605 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 10:14:37.436884 6605 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 10:14:37.438358 6605 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 10:14:37.438401 6605 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 10:14:37.438470 6605 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 10:14:37.438508 6605 factory.go:656] Stopping watch factory\\\\nI0318 10:14:37.438541 6605 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:38Z\\\",\\\"message\\\":\\\"xfvfl\\\\nI0318 10:14:38.896757 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896763 6729 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xfvfl in node crc\\\\nI0318 10:14:38.896768 6729 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xfvfl after 0 failed attempt(s)\\\\nI0318 10:14:38.896772 6729 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896804 6729 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896809 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896815 6729 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-g6j2q in node crc\\\\nI0318 10:14:38.896820 6729 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-g6j2q after 0 failed attempt(s)\\\\nI0318 10:14:38.896824 6729 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896848 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 10:14:38.896915 6729 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\
\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.319798 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 
10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.319864 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.319877 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.319893 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.319903 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.422547 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.422597 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.422610 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.422629 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.422644 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.525813 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.525894 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.525908 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.525927 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.525938 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.628664 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.628724 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.628735 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.628751 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.628760 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.732237 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.732309 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.732321 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.732338 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.732347 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.835442 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.835501 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.835516 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.835537 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.835552 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.857012 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/1.log" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.861179 4733 scope.go:117] "RemoveContainer" containerID="e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2" Mar 18 10:14:40 crc kubenswrapper[4733]: E0318 10:14:40.861324 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.873538 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.884616 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc 
kubenswrapper[4733]: I0318 10:14:40.896574 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.914033 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.931808 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.938031 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.938066 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.938078 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.938116 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.938129 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:40Z","lastTransitionTime":"2026-03-18T10:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.944350 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.955994 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.965606 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.976871 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:40 crc kubenswrapper[4733]: I0318 10:14:40.989068 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1
bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:40Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.006010 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:38Z\\\",\\\"message\\\":\\\"xfvfl\\\\nI0318 10:14:38.896757 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896763 6729 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xfvfl in node crc\\\\nI0318 10:14:38.896768 6729 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-xfvfl after 0 failed attempt(s)\\\\nI0318 10:14:38.896772 6729 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896804 6729 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896809 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896815 6729 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-g6j2q in node crc\\\\nI0318 10:14:38.896820 6729 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-g6j2q after 0 failed attempt(s)\\\\nI0318 10:14:38.896824 6729 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896848 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 10:14:38.896915 6729 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.022675 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.040896 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.040944 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.040958 4733 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.040976 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.040988 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:41Z","lastTransitionTime":"2026-03-18T10:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.042634 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.055571 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.071028 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.083004 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: 
I0318 10:14:41.101599 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.111861 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: E0318 10:14:41.141578 4733 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.174496 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:41 crc kubenswrapper[4733]: E0318 10:14:41.174613 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.174494 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.174493 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:41 crc kubenswrapper[4733]: E0318 10:14:41.174703 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.174716 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:41 crc kubenswrapper[4733]: E0318 10:14:41.174839 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:41 crc kubenswrapper[4733]: E0318 10:14:41.174900 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.186687 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 
10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.197446 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.207910 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.220052 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc 
kubenswrapper[4733]: I0318 10:14:41.232602 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.246480 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.256809 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: E0318 10:14:41.266478 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.273766 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39a
ed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.288988 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.299077 4733 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.308528 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.325694 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:38Z\\\",\\\"message\\\":\\\"xfvfl\\\\nI0318 10:14:38.896757 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896763 6729 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xfvfl in node crc\\\\nI0318 10:14:38.896768 6729 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-xfvfl after 0 failed attempt(s)\\\\nI0318 10:14:38.896772 6729 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896804 6729 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896809 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896815 6729 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-g6j2q in node crc\\\\nI0318 10:14:38.896820 6729 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-g6j2q after 0 failed attempt(s)\\\\nI0318 10:14:38.896824 6729 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896848 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 10:14:38.896915 6729 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.337102 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094
ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.353965 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.362569 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.372607 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.382553 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:41 crc kubenswrapper[4733]: I0318 10:14:41.394033 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:41Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.249467 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.249775 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.249785 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.249800 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.249811 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:42Z","lastTransitionTime":"2026-03-18T10:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:42 crc kubenswrapper[4733]: E0318 10:14:42.263133 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:42Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.267301 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.267368 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.267390 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.267414 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.267432 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:42Z","lastTransitionTime":"2026-03-18T10:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:42 crc kubenswrapper[4733]: E0318 10:14:42.284321 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:42Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.288262 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.288296 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.288308 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.288323 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.288333 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:42Z","lastTransitionTime":"2026-03-18T10:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:42 crc kubenswrapper[4733]: E0318 10:14:42.304300 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:42Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.308946 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.308995 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.309009 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.309029 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.309043 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:42Z","lastTransitionTime":"2026-03-18T10:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:42 crc kubenswrapper[4733]: E0318 10:14:42.323546 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:42Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.327139 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.327217 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.327225 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.327238 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 10:14:42 crc kubenswrapper[4733]: I0318 10:14:42.327247 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:42Z","lastTransitionTime":"2026-03-18T10:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 10:14:42 crc kubenswrapper[4733]: E0318 10:14:42.337920 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:42Z is after 2025-08-24T17:21:41Z"
Mar 18 10:14:42 crc kubenswrapper[4733]: E0318 10:14:42.338064 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 18 10:14:43 crc kubenswrapper[4733]: I0318 10:14:43.175394 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:14:43 crc kubenswrapper[4733]: E0318 10:14:43.175538 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65"
Mar 18 10:14:43 crc kubenswrapper[4733]: I0318 10:14:43.175583 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:14:43 crc kubenswrapper[4733]: I0318 10:14:43.175590 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:14:43 crc kubenswrapper[4733]: E0318 10:14:43.175753 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 10:14:43 crc kubenswrapper[4733]: E0318 10:14:43.175902 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 10:14:43 crc kubenswrapper[4733]: I0318 10:14:43.175936 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:14:43 crc kubenswrapper[4733]: E0318 10:14:43.176062 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 10:14:44 crc kubenswrapper[4733]: I0318 10:14:44.187866 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.173412 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.173611 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.173654 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:15:17.173626508 +0000 UTC m=+156.665360833 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.173707 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.173773 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.173863 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:15:17.173834423 +0000 UTC m=+156.665568788 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.173901 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.173970 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.173985 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.173982 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.173996 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174085 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 10:15:17.17407014 +0000 UTC m=+156.665804505 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174179 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174270 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:15:17.174253405 +0000 UTC m=+156.665987760 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174332 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174372 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:15:17.174360528 +0000 UTC m=+156.666094893 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.174460 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.174507 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174584 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.174644 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174729 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.174772 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174896 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.174984 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65"
Mar 18 10:14:45 crc kubenswrapper[4733]: I0318 10:14:45.275228 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.275427 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.275455 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.275466 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 10:14:45 crc kubenswrapper[4733]: E0318 10:14:45.275525 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:15:17.275508981 +0000 UTC m=+156.767243306 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 10:14:46 crc kubenswrapper[4733]: E0318 10:14:46.268094 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 10:14:47 crc kubenswrapper[4733]: I0318 10:14:47.175301 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:14:47 crc kubenswrapper[4733]: E0318 10:14:47.175464 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 10:14:47 crc kubenswrapper[4733]: I0318 10:14:47.175974 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:47 crc kubenswrapper[4733]: E0318 10:14:47.176081 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:47 crc kubenswrapper[4733]: I0318 10:14:47.176154 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:47 crc kubenswrapper[4733]: E0318 10:14:47.176286 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:47 crc kubenswrapper[4733]: I0318 10:14:47.176506 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:47 crc kubenswrapper[4733]: E0318 10:14:47.176769 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:49 crc kubenswrapper[4733]: I0318 10:14:49.175039 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:49 crc kubenswrapper[4733]: I0318 10:14:49.175117 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:49 crc kubenswrapper[4733]: I0318 10:14:49.175123 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:49 crc kubenswrapper[4733]: I0318 10:14:49.175036 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:49 crc kubenswrapper[4733]: E0318 10:14:49.175309 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:49 crc kubenswrapper[4733]: E0318 10:14:49.176297 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:49 crc kubenswrapper[4733]: E0318 10:14:49.176544 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:49 crc kubenswrapper[4733]: E0318 10:14:49.176800 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.174686 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:51 crc kubenswrapper[4733]: E0318 10:14:51.174897 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.174919 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.174968 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:51 crc kubenswrapper[4733]: E0318 10:14:51.175096 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.174947 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:51 crc kubenswrapper[4733]: E0318 10:14:51.175312 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:51 crc kubenswrapper[4733]: E0318 10:14:51.175438 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.197478 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.213385 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metr
ics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.227804 4733 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.244431 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T
10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: E0318 10:14:51.268847 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.277026 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:38Z\\\",\\\"message\\\":\\\"xfvfl\\\\nI0318 10:14:38.896757 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896763 6729 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xfvfl in node crc\\\\nI0318 10:14:38.896768 6729 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-xfvfl after 0 failed attempt(s)\\\\nI0318 10:14:38.896772 6729 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896804 6729 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896809 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896815 6729 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-g6j2q in node crc\\\\nI0318 10:14:38.896820 6729 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-g6j2q after 0 failed attempt(s)\\\\nI0318 10:14:38.896824 6729 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896848 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 10:14:38.896915 6729 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.304752 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b54b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.324215 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.337375 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.355079 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.376204 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.386694 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: 
I0318 10:14:51.419941 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.436271 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.450031 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.462472 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc 
kubenswrapper[4733]: I0318 10:14:51.477077 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.498070 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.511794 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:51 crc kubenswrapper[4733]: I0318 10:14:51.531587 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:51Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.497025 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.497058 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.497067 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 
10:14:52.497080 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.497088 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:52Z","lastTransitionTime":"2026-03-18T10:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:14:52 crc kubenswrapper[4733]: E0318 10:14:52.515536 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:52Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.519519 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.519554 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.519564 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.519578 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.519590 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:52Z","lastTransitionTime":"2026-03-18T10:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:52 crc kubenswrapper[4733]: E0318 10:14:52.536990 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:52Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.540853 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.540945 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.540970 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.541003 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.541026 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:52Z","lastTransitionTime":"2026-03-18T10:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:52 crc kubenswrapper[4733]: E0318 10:14:52.561103 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:52Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.565998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.566066 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.566091 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.566122 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.566147 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:52Z","lastTransitionTime":"2026-03-18T10:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:52 crc kubenswrapper[4733]: E0318 10:14:52.582121 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:52Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.587979 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.588046 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.588093 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.588127 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:14:52 crc kubenswrapper[4733]: I0318 10:14:52.588152 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:14:52Z","lastTransitionTime":"2026-03-18T10:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:14:52 crc kubenswrapper[4733]: E0318 10:14:52.603084 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:52Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:52 crc kubenswrapper[4733]: E0318 10:14:52.603208 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:14:53 crc kubenswrapper[4733]: I0318 10:14:53.175060 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:53 crc kubenswrapper[4733]: I0318 10:14:53.175109 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:53 crc kubenswrapper[4733]: I0318 10:14:53.175067 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:53 crc kubenswrapper[4733]: I0318 10:14:53.175067 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:53 crc kubenswrapper[4733]: E0318 10:14:53.175275 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:53 crc kubenswrapper[4733]: E0318 10:14:53.175392 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:53 crc kubenswrapper[4733]: E0318 10:14:53.175485 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:53 crc kubenswrapper[4733]: E0318 10:14:53.175864 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.175524 4733 scope.go:117] "RemoveContainer" containerID="e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.916831 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/1.log" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.919356 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6"} Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.919799 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.931877 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:54Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.942016 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b6
34ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:54Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.950484 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:54Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.960741 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:54Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.976935 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:38Z\\\",\\\"message\\\":\\\"xfvfl\\\\nI0318 10:14:38.896757 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896763 6729 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xfvfl in node crc\\\\nI0318 10:14:38.896768 6729 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-xfvfl after 0 failed attempt(s)\\\\nI0318 10:14:38.896772 6729 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896804 6729 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896809 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896815 6729 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-g6j2q in node crc\\\\nI0318 10:14:38.896820 6729 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-g6j2q after 0 failed attempt(s)\\\\nI0318 10:14:38.896824 6729 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896848 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 10:14:38.896915 6729 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:54Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.989474 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b54b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:54Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:54 crc kubenswrapper[4733]: I0318 10:14:54.999176 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:54Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.007466 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.016848 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.031318 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.040726 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: 
I0318 10:14:55.066143 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.074706 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.085244 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.094856 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc 
kubenswrapper[4733]: I0318 10:14:55.104815 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.115387 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.126915 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.138828 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.174869 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:55 crc kubenswrapper[4733]: E0318 10:14:55.174994 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.175162 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.175243 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:55 crc kubenswrapper[4733]: E0318 10:14:55.175253 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:55 crc kubenswrapper[4733]: E0318 10:14:55.175313 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.174882 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:55 crc kubenswrapper[4733]: E0318 10:14:55.175382 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.925084 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/2.log" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.926019 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/1.log" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.929163 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6" exitCode=1 Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.929229 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6"} Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.929330 4733 scope.go:117] "RemoveContainer" containerID="e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.929835 4733 scope.go:117] "RemoveContainer" containerID="b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6" Mar 18 10:14:55 crc 
kubenswrapper[4733]: E0318 10:14:55.929980 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.942780 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.956859 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ho
sts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.972846 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:55 crc kubenswrapper[4733]: I0318 10:14:55.987898 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T10:14:55Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.007421 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ 
'[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b54b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.025320 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.038855 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.061699 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92f95b4e7499a2bf80c4498b1b592c2cb8a47a2602131b74b47ee685f9562f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:38Z\\\",\\\"message\\\":\\\"xfvfl\\\\nI0318 10:14:38.896757 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896763 6729 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xfvfl in node crc\\\\nI0318 10:14:38.896768 6729 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-image-registry/node-ca-xfvfl after 0 failed attempt(s)\\\\nI0318 10:14:38.896772 6729 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xfvfl\\\\nI0318 10:14:38.896804 6729 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896809 6729 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896815 6729 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-g6j2q in node crc\\\\nI0318 10:14:38.896820 6729 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-g6j2q after 0 failed attempt(s)\\\\nI0318 10:14:38.896824 6729 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-g6j2q\\\\nI0318 10:14:38.896848 6729 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 10:14:38.896915 6729 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 
default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\
\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.086440 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"ex
itCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.100389 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.122352 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.139781 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.161175 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.176234 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: 
I0318 10:14:56.192121 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5
e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"set
up\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.208509 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.224002 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.238700 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc 
kubenswrapper[4733]: I0318 10:14:56.255831 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: E0318 10:14:56.270720 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.935910 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/2.log" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.942835 4733 scope.go:117] "RemoveContainer" containerID="b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6" Mar 18 10:14:56 crc kubenswrapper[4733]: E0318 10:14:56.943417 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.966674 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:56 crc kubenswrapper[4733]: I0318 10:14:56.986285 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:56Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.004403 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.019374 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc 
kubenswrapper[4733]: I0318 10:14:57.033793 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.051817 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.063491 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.078298 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.097706 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c
3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.113289 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b5
4b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.133381 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.147521 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.171967 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in 
node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.176306 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.176473 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.176321 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:57 crc kubenswrapper[4733]: E0318 10:14:57.176566 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:57 crc kubenswrapper[4733]: E0318 10:14:57.176707 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.176760 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:57 crc kubenswrapper[4733]: E0318 10:14:57.176886 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:57 crc kubenswrapper[4733]: E0318 10:14:57.177046 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.190214 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.218271 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.231840 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.248673 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.267768 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:57 crc kubenswrapper[4733]: I0318 10:14:57.284342 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:14:57Z is after 2025-08-24T17:21:41Z" Mar 18 10:14:59 crc kubenswrapper[4733]: I0318 10:14:59.175473 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:14:59 crc kubenswrapper[4733]: I0318 10:14:59.175488 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:14:59 crc kubenswrapper[4733]: E0318 10:14:59.176235 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:14:59 crc kubenswrapper[4733]: I0318 10:14:59.175562 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:14:59 crc kubenswrapper[4733]: E0318 10:14:59.176343 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:14:59 crc kubenswrapper[4733]: I0318 10:14:59.175527 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:14:59 crc kubenswrapper[4733]: E0318 10:14:59.176442 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:14:59 crc kubenswrapper[4733]: E0318 10:14:59.176516 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.175477 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:01 crc kubenswrapper[4733]: E0318 10:15:01.175673 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.176305 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.176370 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:01 crc kubenswrapper[4733]: E0318 10:15:01.176399 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:01 crc kubenswrapper[4733]: E0318 10:15:01.176525 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.176760 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:01 crc kubenswrapper[4733]: E0318 10:15:01.177062 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.188760 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.210958 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.222583 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.239280 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.253731 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: E0318 10:15:01.271328 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.275245 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd
6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.293909 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.304311 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.314873 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.325401 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc 
kubenswrapper[4733]: I0318 10:15:01.337759 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.348788 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.357859 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.367692 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.381964 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c
3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.398315 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b5
4b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.413659 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.427309 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:01 crc kubenswrapper[4733]: I0318 10:15:01.451100 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in 
node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:01Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.655925 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.656010 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.656023 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.656051 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.656069 4733 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:02Z","lastTransitionTime":"2026-03-18T10:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:15:02 crc kubenswrapper[4733]: E0318 10:15:02.675082 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:02Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.680016 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.680090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.680104 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.680129 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.680144 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:02Z","lastTransitionTime":"2026-03-18T10:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:02 crc kubenswrapper[4733]: E0318 10:15:02.697743 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:02Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.701690 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.701787 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.701818 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.701854 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.701881 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:02Z","lastTransitionTime":"2026-03-18T10:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:02 crc kubenswrapper[4733]: E0318 10:15:02.718177 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:02Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.722282 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.722318 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.722331 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.722355 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.722369 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:02Z","lastTransitionTime":"2026-03-18T10:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:02 crc kubenswrapper[4733]: E0318 10:15:02.738335 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:02Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.743042 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.743114 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.743136 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.743164 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:02 crc kubenswrapper[4733]: I0318 10:15:02.743227 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:02Z","lastTransitionTime":"2026-03-18T10:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:02 crc kubenswrapper[4733]: E0318 10:15:02.760068 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:02Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:02 crc kubenswrapper[4733]: E0318 10:15:02.760349 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:15:03 crc kubenswrapper[4733]: I0318 10:15:03.175447 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:03 crc kubenswrapper[4733]: I0318 10:15:03.175568 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:03 crc kubenswrapper[4733]: I0318 10:15:03.175601 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:03 crc kubenswrapper[4733]: E0318 10:15:03.175702 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:03 crc kubenswrapper[4733]: I0318 10:15:03.175763 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:03 crc kubenswrapper[4733]: E0318 10:15:03.175887 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:03 crc kubenswrapper[4733]: E0318 10:15:03.175968 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:03 crc kubenswrapper[4733]: E0318 10:15:03.176021 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:05 crc kubenswrapper[4733]: I0318 10:15:05.175065 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:05 crc kubenswrapper[4733]: I0318 10:15:05.175138 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:05 crc kubenswrapper[4733]: I0318 10:15:05.175237 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:05 crc kubenswrapper[4733]: E0318 10:15:05.176555 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:05 crc kubenswrapper[4733]: E0318 10:15:05.176718 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:05 crc kubenswrapper[4733]: I0318 10:15:05.175272 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:05 crc kubenswrapper[4733]: E0318 10:15:05.176931 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:05 crc kubenswrapper[4733]: E0318 10:15:05.177139 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:06 crc kubenswrapper[4733]: E0318 10:15:06.272751 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:07 crc kubenswrapper[4733]: I0318 10:15:07.174814 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:07 crc kubenswrapper[4733]: I0318 10:15:07.174929 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:07 crc kubenswrapper[4733]: I0318 10:15:07.174929 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:07 crc kubenswrapper[4733]: E0318 10:15:07.175571 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:07 crc kubenswrapper[4733]: E0318 10:15:07.175556 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:07 crc kubenswrapper[4733]: I0318 10:15:07.175081 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:07 crc kubenswrapper[4733]: E0318 10:15:07.175863 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:07 crc kubenswrapper[4733]: E0318 10:15:07.176308 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:09 crc kubenswrapper[4733]: I0318 10:15:09.175045 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:09 crc kubenswrapper[4733]: E0318 10:15:09.175313 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:09 crc kubenswrapper[4733]: I0318 10:15:09.175539 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:09 crc kubenswrapper[4733]: I0318 10:15:09.175646 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:09 crc kubenswrapper[4733]: E0318 10:15:09.175781 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:09 crc kubenswrapper[4733]: I0318 10:15:09.175802 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:09 crc kubenswrapper[4733]: E0318 10:15:09.176056 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:09 crc kubenswrapper[4733]: E0318 10:15:09.176137 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.174629 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.174741 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.175469 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.175570 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:11 crc kubenswrapper[4733]: E0318 10:15:11.175667 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:11 crc kubenswrapper[4733]: E0318 10:15:11.175689 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:11 crc kubenswrapper[4733]: E0318 10:15:11.175851 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:11 crc kubenswrapper[4733]: E0318 10:15:11.175908 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.176675 4733 scope.go:117] "RemoveContainer" containerID="b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6" Mar 18 10:15:11 crc kubenswrapper[4733]: E0318 10:15:11.176849 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.196469 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in 
node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.210322 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b54b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.224138 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.237846 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.255401 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: E0318 10:15:11.274913 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.280116 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd
6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.294034 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.313272 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a4
67cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.322645 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.334663 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.347609 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc 
kubenswrapper[4733]: I0318 10:15:11.360403 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.373636 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.395366 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.408778 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.425612 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.441157 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b6
34ef119368039982dbafab7f160c3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.455601 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:11 crc kubenswrapper[4733]: I0318 10:15:11.470455 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:11Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.866621 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.866664 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.866679 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.866695 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.866708 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:12Z","lastTransitionTime":"2026-03-18T10:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:12 crc kubenswrapper[4733]: E0318 10:15:12.882245 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:12Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.886998 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.887042 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.887057 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.887077 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.887095 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:12Z","lastTransitionTime":"2026-03-18T10:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:12 crc kubenswrapper[4733]: E0318 10:15:12.905114 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:12Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.909883 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.909961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.909984 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.910017 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.910041 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:12Z","lastTransitionTime":"2026-03-18T10:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:12 crc kubenswrapper[4733]: E0318 10:15:12.924365 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:12Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.929334 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.929381 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.929396 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.929417 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.929431 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:12Z","lastTransitionTime":"2026-03-18T10:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:12 crc kubenswrapper[4733]: E0318 10:15:12.943848 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:12Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.948587 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.948663 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.948686 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.948717 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:12 crc kubenswrapper[4733]: I0318 10:15:12.948740 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:12Z","lastTransitionTime":"2026-03-18T10:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:12 crc kubenswrapper[4733]: E0318 10:15:12.965637 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:12Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:12 crc kubenswrapper[4733]: E0318 10:15:12.965786 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.002824 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/0.log" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.002917 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc85b0d4-15a5-4894-9f07-9aaeb28f63fa" containerID="cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034" exitCode=1 Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.002982 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6j2q" event={"ID":"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa","Type":"ContainerDied","Data":"cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034"} Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.003713 4733 scope.go:117] "RemoveContainer" containerID="cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.021370 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.043580 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.061916 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: 
I0318 10:15:13.089939 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.103582 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.118979 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.132314 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc 
kubenswrapper[4733]: I0318 10:15:13.149102 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.164970 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.174724 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.174776 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.174887 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:13 crc kubenswrapper[4733]: E0318 10:15:13.174898 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:13 crc kubenswrapper[4733]: E0318 10:15:13.175019 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:13 crc kubenswrapper[4733]: E0318 10:15:13.175149 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.175305 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:13 crc kubenswrapper[4733]: E0318 10:15:13.175422 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.177522 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.193061 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.207034 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"2026-03-18T10:14:27+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c\\\\n2026-03-18T10:14:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c to /host/opt/cni/bin/\\\\n2026-03-18T10:14:27Z [verbose] multus-daemon started\\\\n2026-03-18T10:14:27Z [verbose] Readiness Indicator file check\\\\n2026-03-18T10:15:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.220637 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.233457 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.249320 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.271238 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod 
openshift-etcd/etcd-crc in node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.288163 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b54b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.304871 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:13 crc kubenswrapper[4733]: I0318 10:15:13.317752 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:13Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.009004 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/0.log" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.009074 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6j2q" event={"ID":"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa","Type":"ContainerStarted","Data":"b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a"} Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.025017 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc 
kubenswrapper[4733]: I0318 10:15:14.044713 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.062612 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.079667 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.102777 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.118482 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"2026-03-18T10:14:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c\\\\n2026-03-18T10:14:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c to /host/opt/cni/bin/\\\\n2026-03-18T10:14:27Z [verbose] multus-daemon started\\\\n2026-03-18T10:14:27Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T10:15:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.138344 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.152052 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.170152 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.207340 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod 
openshift-etcd/etcd-crc in node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.226816 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b54b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.246403 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.262363 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.281045 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.297700 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.316042 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: 
I0318 10:15:14.346897 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.363894 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:14 crc kubenswrapper[4733]: I0318 10:15:14.382400 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:14Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:15 crc kubenswrapper[4733]: I0318 10:15:15.174502 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:15 crc kubenswrapper[4733]: I0318 10:15:15.174590 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:15 crc kubenswrapper[4733]: E0318 10:15:15.174650 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:15 crc kubenswrapper[4733]: I0318 10:15:15.174770 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:15 crc kubenswrapper[4733]: E0318 10:15:15.174806 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:15 crc kubenswrapper[4733]: I0318 10:15:15.174837 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:15 crc kubenswrapper[4733]: E0318 10:15:15.175026 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:15 crc kubenswrapper[4733]: E0318 10:15:15.175155 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:16 crc kubenswrapper[4733]: E0318 10:15:16.276676 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.174646 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.174705 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.174721 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.174896 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.174929 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.175114 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.175325 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.175458 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.227466 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.227617 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.227732 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:21.227696183 +0000 UTC m=+220.719430558 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.227823 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.227831 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.227857 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.227886 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.227893 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.227964 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:21.227939931 +0000 UTC m=+220.719674296 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.228007 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.228029 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.228097 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 10:16:21.228078335 +0000 UTC m=+220.719812700 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.228149 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.228228 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.228255 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:21.22823218 +0000 UTC m=+220.719966535 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.228296 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:21.228277271 +0000 UTC m=+220.720011636 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 10:15:17 crc kubenswrapper[4733]: I0318 10:15:17.328836 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.329030 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.329061 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.329080 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 10:15:17 crc kubenswrapper[4733]: E0318 10:15:17.329163 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:21.329141008 +0000 UTC m=+220.820875373 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 10:15:19 crc kubenswrapper[4733]: I0318 10:15:19.175410 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:15:19 crc kubenswrapper[4733]: I0318 10:15:19.175548 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:15:19 crc kubenswrapper[4733]: I0318 10:15:19.175626 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:15:19 crc kubenswrapper[4733]: I0318 10:15:19.175616 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:15:19 crc kubenswrapper[4733]: E0318 10:15:19.176916 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65"
Mar 18 10:15:19 crc kubenswrapper[4733]: E0318 10:15:19.177108 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 10:15:19 crc kubenswrapper[4733]: E0318 10:15:19.177286 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 10:15:19 crc kubenswrapper[4733]: E0318 10:15:19.176752 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.175210 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:15:21 crc kubenswrapper[4733]: E0318 10:15:21.175330 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.175395 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.175449 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.175517 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:15:21 crc kubenswrapper[4733]: E0318 10:15:21.175472 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 10:15:21 crc kubenswrapper[4733]: E0318 10:15:21.175633 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 10:15:21 crc kubenswrapper[4733]: E0318 10:15:21.175685 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.196334 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.213792 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.248113 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.266872 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z"
Mar 18 10:15:21 crc kubenswrapper[4733]: E0318 10:15:21.277074 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.283111 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.296281 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"2026-03-18T10:14:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c\\\\n2026-03-18T10:14:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c to /host/opt/cni/bin/\\\\n2026-03-18T10:14:27Z [verbose] multus-daemon started\\\\n2026-03-18T10:14:27Z [verbose] Readiness Indicator file check\\\\n2026-03-18T10:15:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z"
Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.306438 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.315106 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.325597 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.341555 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod 
openshift-etcd/etcd-crc in node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.356910 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b54b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.370137 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.378670 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.391111 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.404141 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.414756 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: 
I0318 10:15:21.436437 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.446937 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:21 crc kubenswrapper[4733]: I0318 10:15:21.460368 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:21Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.153160 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.153264 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.153293 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.153316 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.153333 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:23Z","lastTransitionTime":"2026-03-18T10:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.175304 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.175422 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.175594 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.175775 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.175906 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.175981 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:23Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.176454 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.177001 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.177348 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.181646 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.181707 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.181729 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.181752 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.181770 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:23Z","lastTransitionTime":"2026-03-18T10:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.202327 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:23Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.207454 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.207507 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.207528 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.207553 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.207574 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:23Z","lastTransitionTime":"2026-03-18T10:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.228676 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:23Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.233877 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.234097 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.234275 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.234436 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.234580 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:23Z","lastTransitionTime":"2026-03-18T10:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.255477 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:23Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.260961 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.261238 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.261583 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.261759 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:23 crc kubenswrapper[4733]: I0318 10:15:23.261909 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:23Z","lastTransitionTime":"2026-03-18T10:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.280209 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:23Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:23 crc kubenswrapper[4733]: E0318 10:15:23.280328 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:15:24 crc kubenswrapper[4733]: I0318 10:15:24.176565 4733 scope.go:117] "RemoveContainer" containerID="b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.052985 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/2.log" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.055619 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.056402 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.075617 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.090279 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.107332 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.128898 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395
c11621a73a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"2026-03-18T10:14:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c\\\\n2026-03-18T10:14:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c to /host/opt/cni/bin/\\\\n2026-03-18T10:14:27Z [verbose] multus-daemon started\\\\n2026-03-18T10:14:27Z [verbose] Readiness Indicator file check\\\\n2026-03-18T10:15:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.140827 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b5
4b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.152132 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.167954 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.174884 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.174935 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.175017 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:25 crc kubenswrapper[4733]: E0318 10:15:25.175098 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:25 crc kubenswrapper[4733]: E0318 10:15:25.175256 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:25 crc kubenswrapper[4733]: E0318 10:15:25.175323 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.175443 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:25 crc kubenswrapper[4733]: E0318 10:15:25.175591 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.191821 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in 
node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:15:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.208019 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.220073 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: 
I0318 10:15:25.242248 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.253882 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.266837 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.281384 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.297521 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.312325 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.325914 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.344009 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:25 crc kubenswrapper[4733]: I0318 10:15:25.357181 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:25Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc 
kubenswrapper[4733]: I0318 10:15:26.062932 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/3.log"
Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.064003 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/2.log"
Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.067848 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" exitCode=1
Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.067909 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"}
Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.067959 4733 scope.go:117] "RemoveContainer" containerID="b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6"
Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.069094 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"
Mar 18 10:15:26 crc kubenswrapper[4733]: E0318 10:15:26.069367 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a"
Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.091787 4733 status_manager.go:875] "Failed to update status for pod"
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f
7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.107342 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.128036 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.142145 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc 
kubenswrapper[4733]: I0318 10:15:26.165022 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.181902 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.193180 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.204797 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.216988 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395
c11621a73a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"2026-03-18T10:14:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c\\\\n2026-03-18T10:14:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c to /host/opt/cni/bin/\\\\n2026-03-18T10:14:27Z [verbose] multus-daemon started\\\\n2026-03-18T10:14:27Z [verbose] Readiness Indicator file check\\\\n2026-03-18T10:15:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.232064 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b5
4b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.243374 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.253771 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: E0318 10:15:26.278044 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.282140 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a81a8bb9ca8ad4c87fd9b3cd1ae0f5c21d0e4b39a32bd67b6c63b41175d0a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:14:54Z\\\",\\\"message\\\":\\\"\\\\nI0318 10:14:54.959635 6930 default_network_controller.go:776] Recording success 
event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0318 10:14:54.959527 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959642 6930 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0318 10:14:54.959646 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 10:14:54.959649 6930 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 10:14:54.959466 6930 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0318 10:14:54.959658 6930 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0318 10:14:54.959662 6930 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0318 10:14:54.959665 6930 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0318 10:14:54.959243 6930 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:25Z\\\",\\\"message\\\":\\\"_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0318 10:15:25.172380 7260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:15:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.305495 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.316483 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.328045 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.340599 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.360651 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:26 crc kubenswrapper[4733]: I0318 10:15:26.376266 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:26Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: 
I0318 10:15:27.074782 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/3.log" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.081146 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:15:27 crc kubenswrapper[4733]: E0318 10:15:27.081494 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.098277 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.118069 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.138310 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732
e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.171499 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.174579 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.174629 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.174697 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:27 crc kubenswrapper[4733]: E0318 10:15:27.174823 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.174859 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:27 crc kubenswrapper[4733]: E0318 10:15:27.175028 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:27 crc kubenswrapper[4733]: E0318 10:15:27.175179 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:27 crc kubenswrapper[4733]: E0318 10:15:27.175384 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.195587 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.213818 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"2026-03-18T10:14:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c\\\\n2026-03-18T10:14:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c to /host/opt/cni/bin/\\\\n2026-03-18T10:14:27Z [verbose] multus-daemon started\\\\n2026-03-18T10:14:27Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T10:15:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.233915 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.247801 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.263368 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.294998 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:25Z\\\",\\\"message\\\":\\\"_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0318 10:15:25.172380 7260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:15:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.316307 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b54b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.333983 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.348247 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.365559 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.391184 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.402829 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: 
I0318 10:15:27.426007 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.441308 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:27 crc kubenswrapper[4733]: I0318 10:15:27.459971 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:27Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:29 crc kubenswrapper[4733]: I0318 10:15:29.175246 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:29 crc kubenswrapper[4733]: I0318 10:15:29.175320 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:29 crc kubenswrapper[4733]: I0318 10:15:29.175331 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:29 crc kubenswrapper[4733]: E0318 10:15:29.175434 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:29 crc kubenswrapper[4733]: I0318 10:15:29.175522 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:29 crc kubenswrapper[4733]: E0318 10:15:29.175549 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:29 crc kubenswrapper[4733]: E0318 10:15:29.175832 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:29 crc kubenswrapper[4733]: E0318 10:15:29.176304 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.174756 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.174883 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:31 crc kubenswrapper[4733]: E0318 10:15:31.174958 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.174795 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.174888 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:31 crc kubenswrapper[4733]: E0318 10:15:31.175104 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:31 crc kubenswrapper[4733]: E0318 10:15:31.175291 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:31 crc kubenswrapper[4733]: E0318 10:15:31.175459 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.208027 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70b23fcc-38c7-420b-ad9a-57d1c547c788\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb888f7a23904596729e28ec137231447f22565be42be8589f1481aa52efd9f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d0f02cb69f907a82795f47bfae39d1f750bb7bedeeb6d0802e84087dd7150df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfda710da166c7b27fe6df3f38f5f969d0edea58503530ace9d35e3a7ec1420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b454a77a46e10fcea3615e1f59d7849430a461ee7392b37fbbb6ec89e53eb432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://448f3d210c3e435bb68acc8f81dd92e63739d073e0d3746be3985c3d3fe07556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fa633c3540b7c0b9ac0a8ccaca1fb619e1fffff85dd8626f4140c926862ba1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://980838b633a0f6d7144508883fbb8308ffbdba2b27ce7eecf1e046f28b30ec6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1b9a779bc130375be7c7d0792015ec5a467cfc85fd2bead18618dc5e292b2e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.220878 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"908bd772-fb33-4f68-8971-d1fef3118c82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3457636bb3e1cc25507158454524b9cee6812beb56c7b22fb86b9438b8082488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5452bdbefc77b8eab2e8b5f9f71d0edb627db0661edd4abd3989970363812fb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.235682 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a37904dce4f31563b6bf3db4a4e779fcaebf12e80cdabf402fb1fcf03320f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.250029 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.267340 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t28sh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f82588a-9dbd-4c55-8cfc-f96e57fa58b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14c8bb1225c6c415d19ccaf11f0117aa22ccf43aa3b80472a8779ec5cea1aeb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f534576a07c40f1e53709418bbc816e439ac9e036c97e780922c19e36c272642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5819ab8cd6ee5f116abf7861120afb8ac702ade05eb0efc4f73366042f2dd3bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50025671e1a19a96b715db63cbfa653f3cf078d3bfcfd94803bca2fac9636637\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb1a
e24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb1ae24448e6a65c00cce421e6bbc772e3cd734f8f35fb411ce8eac5d662256\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaee328f999419b8d4f3a3298dbf38fd35bc92f76224d38c6769aa7426bb9a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0b2c5f56088e948c02d27d94da94aba67e2c6ffc58442adc30586a548271b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwk4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t28sh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: E0318 10:15:31.281891 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.281667 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f75e1c5-e0c5-43df-944f-77b734070793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b4eaa631b67f13321cd60f9136da1832c5cd6e226609c01cabfa28410630a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12
962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpnv6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2h7dp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.298897 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb303e3-8922-4b43-9bba-2d3f0c30c6b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:42Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 10:13:41.916017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 10:13:41.916132 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 10:13:41.917022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1943543564/tls.crt::/tmp/serving-cert-1943543564/tls.key\\\\\\\"\\\\nI0318 10:13:42.070462 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 10:13:42.072416 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 10:13:42.072438 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 10:13:42.072464 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 10:13:42.072469 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 10:13:42.076902 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 10:13:42.076943 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076949 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 10:13:42.076959 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 10:13:42.076962 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 10:13:42.076967 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 10:13:42.076974 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 10:13:42.077028 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 10:13:42.078631 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:13:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.311455 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e797e62-fc82-47f7-8c8c-6c11d3463304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cb2e53d9e61f6e93594f61ef9614e057a66575c32d18a010ab1ecfd3ac367f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2fe29241779e03381bb946ac650ea8a793785c0c3ed67302dd89f1c5e0d93e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b09c8d5c3c63eb7d9db92ce941aec0f0def87adbc1d46334ccc518a47c60f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8b4b403598b0be68c5baba6e126ecad218005a9c2aeea9badf14dfc4859dce03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:12:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.324441 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.333987 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4s425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3650177-e338-4eba-ab42-bc0cd14c9d65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9zpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4s425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc 
kubenswrapper[4733]: I0318 10:15:31.347655 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a14e8a496af63cf1951ed21cfb3b13b1b516b00271dce19cdf858148beff398b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61bc78e89fc84025b585b2a421fa96e8da9f90840b8c78c0658f30d8738c64ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.364783 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d693a73-68c1-4595-bbcc-be97691b06fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07463e9cec5d204a136bc3da2a197f348b611ad242f9652741da372ebc490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e784035b634ef119368039982dbafab7f160c
3864fe9ef9f5236d906de281b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg7hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-spfjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.379362 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hsk58" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2c181c8-3361-40a2-afc5-a677e0ab4ecd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ffcba189533d7ca155ab3284efac3d072ee3bc46d4b2a61247261bdaecb152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-httph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hsk58\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.395018 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa6e82080eecc5cde4d763e00b69fb4234de74431affa584f0b900a811dd2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.417217 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g6j2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395
c11621a73a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:12Z\\\",\\\"message\\\":\\\"2026-03-18T10:14:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c\\\\n2026-03-18T10:14:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_92f48003-aa44-42d0-a76f-02756a51562c to /host/opt/cni/bin/\\\\n2026-03-18T10:14:27Z [verbose] multus-daemon started\\\\n2026-03-18T10:14:27Z [verbose] Readiness Indicator file check\\\\n2026-03-18T10:15:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:15:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ph8vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g6j2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.438872 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"353ee984-b20f-41fa-978a-0167c20ede36\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:12:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4287a7d43815108131e4b725925805740a64682bc2a9c96ff054f65517e501f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7c73fedb720681572ba31d10e49b7fc28537f98b4afb32bee611e6265eafaff\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T10:13:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 10:12:43.210581 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 10:12:43.213660 1 observer_polling.go:159] Starting file observer\\\\nI0318 10:12:43.251533 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 10:12:43.256315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 10:13:13.491530 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:13:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b5
4b5cea02ea38b404d6b5730afbab0f729978207023e1dfa7cc49ea9736795\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28b29e8c4af41ef6391d7ea79821c7caa64424b8113473541a96ae936db10015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:12:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:12:41Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.453715 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.468432 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xfvfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb58b528-9013-4fab-9747-60bb6ff1bc1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc72346f1bb873e40a1063486ebd2adfd16e3958e17730370c00cb3b775a982c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg7jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xfvfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:31 crc kubenswrapper[4733]: I0318 10:15:31.493754 4733 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73327417-4d3b-45f1-b3b6-575fdeeaa31a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T10:14:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T10:15:25Z\\\",\\\"message\\\":\\\"_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0318 10:15:25.172380 7260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T10:15:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T10:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e8b34a9866e756d4
bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T10:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T10:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqxdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T10:14:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pxwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:31Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.175239 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.175349 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.175404 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.175477 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.175425 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.175648 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.175723 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.175819 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.619641 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.619681 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.619692 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.619709 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.619720 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:33Z","lastTransitionTime":"2026-03-18T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.632147 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.635976 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.636049 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.636076 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.636106 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.636130 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:33Z","lastTransitionTime":"2026-03-18T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.657969 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.662875 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.662930 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.662945 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.662967 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.662984 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:33Z","lastTransitionTime":"2026-03-18T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.679404 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.683374 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.683429 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.683447 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.683469 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.683484 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:33Z","lastTransitionTime":"2026-03-18T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.702659 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.707046 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.707090 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.707103 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.707121 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:33 crc kubenswrapper[4733]: I0318 10:15:33.707133 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:33Z","lastTransitionTime":"2026-03-18T10:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.724887 4733 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T10:15:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a826494-c246-4717-869b-fd136e2b8410\\\",\\\"systemUUID\\\":\\\"fe704b25-4cdf-410a-9afb-ebc7963f4bc5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T10:15:33Z is after 2025-08-24T17:21:41Z" Mar 18 10:15:33 crc kubenswrapper[4733]: E0318 10:15:33.724996 4733 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 10:15:35 crc kubenswrapper[4733]: I0318 10:15:35.175318 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:35 crc kubenswrapper[4733]: I0318 10:15:35.175384 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:35 crc kubenswrapper[4733]: E0318 10:15:35.175542 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:35 crc kubenswrapper[4733]: I0318 10:15:35.175584 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:35 crc kubenswrapper[4733]: E0318 10:15:35.175718 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:35 crc kubenswrapper[4733]: I0318 10:15:35.175320 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:35 crc kubenswrapper[4733]: E0318 10:15:35.175864 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:35 crc kubenswrapper[4733]: E0318 10:15:35.176153 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:36 crc kubenswrapper[4733]: E0318 10:15:36.283961 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:37 crc kubenswrapper[4733]: I0318 10:15:37.175167 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:37 crc kubenswrapper[4733]: I0318 10:15:37.175323 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:37 crc kubenswrapper[4733]: E0318 10:15:37.175360 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:37 crc kubenswrapper[4733]: I0318 10:15:37.175477 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:37 crc kubenswrapper[4733]: E0318 10:15:37.175688 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:37 crc kubenswrapper[4733]: I0318 10:15:37.175809 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:37 crc kubenswrapper[4733]: E0318 10:15:37.175921 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:37 crc kubenswrapper[4733]: E0318 10:15:37.176004 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:38 crc kubenswrapper[4733]: I0318 10:15:38.176453 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:15:38 crc kubenswrapper[4733]: E0318 10:15:38.176737 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:15:39 crc kubenswrapper[4733]: I0318 10:15:39.175554 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:39 crc kubenswrapper[4733]: I0318 10:15:39.175593 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:39 crc kubenswrapper[4733]: I0318 10:15:39.175717 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:39 crc kubenswrapper[4733]: E0318 10:15:39.175750 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:39 crc kubenswrapper[4733]: E0318 10:15:39.175894 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:39 crc kubenswrapper[4733]: I0318 10:15:39.175987 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:39 crc kubenswrapper[4733]: E0318 10:15:39.176032 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:39 crc kubenswrapper[4733]: E0318 10:15:39.176239 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.174977 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.175063 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.175146 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:41 crc kubenswrapper[4733]: E0318 10:15:41.175382 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.175468 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:41 crc kubenswrapper[4733]: E0318 10:15:41.175614 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:41 crc kubenswrapper[4733]: E0318 10:15:41.176519 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:41 crc kubenswrapper[4733]: E0318 10:15:41.176678 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.213554 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g6j2q" podStartSLOduration=123.213535372 podStartE2EDuration="2m3.213535372s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.210940249 +0000 UTC m=+180.702674574" watchObservedRunningTime="2026-03-18 10:15:41.213535372 +0000 UTC m=+180.705269697" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.243133 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hsk58" podStartSLOduration=123.243110148 podStartE2EDuration="2m3.243110148s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.243036746 +0000 UTC m=+180.734771081" watchObservedRunningTime="2026-03-18 10:15:41.243110148 +0000 UTC m=+180.734844473" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.243369 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-spfjj" podStartSLOduration=123.243361866 podStartE2EDuration="2m3.243361866s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.225452793 +0000 UTC m=+180.717187208" watchObservedRunningTime="2026-03-18 10:15:41.243361866 +0000 UTC m=+180.735096191" Mar 18 10:15:41 crc kubenswrapper[4733]: E0318 10:15:41.287610 4733 kubelet.go:2916] "Container runtime network not ready" 
networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.306833 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=57.306819136 podStartE2EDuration="57.306819136s" podCreationTimestamp="2026-03-18 10:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.306056802 +0000 UTC m=+180.797791127" watchObservedRunningTime="2026-03-18 10:15:41.306819136 +0000 UTC m=+180.798553461" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.341231 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xfvfl" podStartSLOduration=123.341144854 podStartE2EDuration="2m3.341144854s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.340521164 +0000 UTC m=+180.832255529" watchObservedRunningTime="2026-03-18 10:15:41.341144854 +0000 UTC m=+180.832879229" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.378746 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t28sh" podStartSLOduration=123.378719836 podStartE2EDuration="2m3.378719836s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.376854096 +0000 UTC m=+180.868588461" watchObservedRunningTime="2026-03-18 10:15:41.378719836 +0000 UTC m=+180.870454161" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 
10:15:41.421402 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=79.421377801 podStartE2EDuration="1m19.421377801s" podCreationTimestamp="2026-03-18 10:14:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.42106005 +0000 UTC m=+180.912794405" watchObservedRunningTime="2026-03-18 10:15:41.421377801 +0000 UTC m=+180.913112126" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.421906 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podStartSLOduration=123.421888687 podStartE2EDuration="2m3.421888687s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.394536332 +0000 UTC m=+180.886270697" watchObservedRunningTime="2026-03-18 10:15:41.421888687 +0000 UTC m=+180.913623012" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.432930 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=88.432894119 podStartE2EDuration="1m28.432894119s" podCreationTimestamp="2026-03-18 10:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.43135764 +0000 UTC m=+180.923091965" watchObservedRunningTime="2026-03-18 10:15:41.432894119 +0000 UTC m=+180.924628484" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.498930 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.498909051 podStartE2EDuration="1m19.498909051s" podCreationTimestamp="2026-03-18 10:14:22 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.49856666 +0000 UTC m=+180.990300995" watchObservedRunningTime="2026-03-18 10:15:41.498909051 +0000 UTC m=+180.990643366" Mar 18 10:15:41 crc kubenswrapper[4733]: I0318 10:15:41.510940 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=66.510917455 podStartE2EDuration="1m6.510917455s" podCreationTimestamp="2026-03-18 10:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:41.510609745 +0000 UTC m=+181.002344070" watchObservedRunningTime="2026-03-18 10:15:41.510917455 +0000 UTC m=+181.002651790" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.174821 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.174898 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.174956 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:43 crc kubenswrapper[4733]: E0318 10:15:43.175055 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.175072 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:43 crc kubenswrapper[4733]: E0318 10:15:43.175205 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:43 crc kubenswrapper[4733]: E0318 10:15:43.175301 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:43 crc kubenswrapper[4733]: E0318 10:15:43.175357 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.792038 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.792099 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.792118 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.792145 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.792165 4733 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T10:15:43Z","lastTransitionTime":"2026-03-18T10:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.865536 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7"] Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.866394 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.869382 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.869706 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.873326 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 10:15:43 crc kubenswrapper[4733]: I0318 10:15:43.873395 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.046171 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.046322 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.046394 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.046477 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.046601 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.147854 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.147911 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.147955 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.147999 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.148045 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.148136 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.148184 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.149830 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.158059 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.169580 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ed168c9-7562-4ab5-8cf4-5edfa11200e7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8nsd7\" (UID: \"1ed168c9-7562-4ab5-8cf4-5edfa11200e7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.192270 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.282207 4733 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 10:15:44 crc kubenswrapper[4733]: I0318 10:15:44.292065 4733 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 10:15:45 crc kubenswrapper[4733]: I0318 10:15:45.143757 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" event={"ID":"1ed168c9-7562-4ab5-8cf4-5edfa11200e7","Type":"ContainerStarted","Data":"af3b957f0d8068e24e6b30b30a154402543b1c9b867238d2a05edf65767bee90"} Mar 18 10:15:45 crc kubenswrapper[4733]: I0318 10:15:45.143824 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" event={"ID":"1ed168c9-7562-4ab5-8cf4-5edfa11200e7","Type":"ContainerStarted","Data":"4bfa48ba8fbdea9f5c215496267acbf7b6181ea4a50d5f4a1e09661215475b0c"} Mar 18 10:15:45 crc kubenswrapper[4733]: I0318 10:15:45.175015 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:45 crc kubenswrapper[4733]: I0318 10:15:45.175029 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:45 crc kubenswrapper[4733]: E0318 10:15:45.175389 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:45 crc kubenswrapper[4733]: I0318 10:15:45.175162 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:45 crc kubenswrapper[4733]: E0318 10:15:45.175741 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:45 crc kubenswrapper[4733]: E0318 10:15:45.175619 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:45 crc kubenswrapper[4733]: I0318 10:15:45.175063 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:45 crc kubenswrapper[4733]: E0318 10:15:45.176001 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:46 crc kubenswrapper[4733]: E0318 10:15:46.288776 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:47 crc kubenswrapper[4733]: I0318 10:15:47.174679 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:47 crc kubenswrapper[4733]: I0318 10:15:47.174687 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:47 crc kubenswrapper[4733]: I0318 10:15:47.174770 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:47 crc kubenswrapper[4733]: I0318 10:15:47.174908 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:47 crc kubenswrapper[4733]: E0318 10:15:47.175120 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:47 crc kubenswrapper[4733]: E0318 10:15:47.175270 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:47 crc kubenswrapper[4733]: E0318 10:15:47.175354 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:47 crc kubenswrapper[4733]: E0318 10:15:47.175466 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:49 crc kubenswrapper[4733]: I0318 10:15:49.174926 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:49 crc kubenswrapper[4733]: I0318 10:15:49.174958 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:49 crc kubenswrapper[4733]: E0318 10:15:49.175051 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:49 crc kubenswrapper[4733]: I0318 10:15:49.174927 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:49 crc kubenswrapper[4733]: I0318 10:15:49.175140 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:49 crc kubenswrapper[4733]: E0318 10:15:49.176266 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:49 crc kubenswrapper[4733]: I0318 10:15:49.179503 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:15:49 crc kubenswrapper[4733]: E0318 10:15:49.179866 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:49 crc kubenswrapper[4733]: E0318 10:15:49.180248 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:15:49 crc kubenswrapper[4733]: E0318 10:15:49.180484 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:51 crc kubenswrapper[4733]: I0318 10:15:51.174874 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:51 crc kubenswrapper[4733]: I0318 10:15:51.176162 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:51 crc kubenswrapper[4733]: E0318 10:15:51.176356 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:51 crc kubenswrapper[4733]: I0318 10:15:51.176414 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:51 crc kubenswrapper[4733]: E0318 10:15:51.176951 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:51 crc kubenswrapper[4733]: I0318 10:15:51.176446 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:51 crc kubenswrapper[4733]: E0318 10:15:51.177100 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:51 crc kubenswrapper[4733]: E0318 10:15:51.177417 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:51 crc kubenswrapper[4733]: E0318 10:15:51.289406 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:53 crc kubenswrapper[4733]: I0318 10:15:53.174536 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:53 crc kubenswrapper[4733]: I0318 10:15:53.174608 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:53 crc kubenswrapper[4733]: I0318 10:15:53.174562 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:53 crc kubenswrapper[4733]: E0318 10:15:53.174762 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:53 crc kubenswrapper[4733]: I0318 10:15:53.174790 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:53 crc kubenswrapper[4733]: E0318 10:15:53.175389 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:53 crc kubenswrapper[4733]: E0318 10:15:53.175552 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:53 crc kubenswrapper[4733]: E0318 10:15:53.175755 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:55 crc kubenswrapper[4733]: I0318 10:15:55.174923 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:55 crc kubenswrapper[4733]: I0318 10:15:55.174945 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:55 crc kubenswrapper[4733]: I0318 10:15:55.175052 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:55 crc kubenswrapper[4733]: E0318 10:15:55.175246 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:55 crc kubenswrapper[4733]: E0318 10:15:55.175461 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:55 crc kubenswrapper[4733]: E0318 10:15:55.175623 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:55 crc kubenswrapper[4733]: I0318 10:15:55.176237 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:55 crc kubenswrapper[4733]: E0318 10:15:55.176378 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:56 crc kubenswrapper[4733]: E0318 10:15:56.291263 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:15:57 crc kubenswrapper[4733]: I0318 10:15:57.175411 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:57 crc kubenswrapper[4733]: E0318 10:15:57.175570 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:57 crc kubenswrapper[4733]: I0318 10:15:57.175773 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:57 crc kubenswrapper[4733]: E0318 10:15:57.175830 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:57 crc kubenswrapper[4733]: I0318 10:15:57.175954 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:57 crc kubenswrapper[4733]: E0318 10:15:57.176014 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:57 crc kubenswrapper[4733]: I0318 10:15:57.176178 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:57 crc kubenswrapper[4733]: E0318 10:15:57.176390 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.175419 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.175514 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.175538 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:15:59 crc kubenswrapper[4733]: E0318 10:15:59.175632 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:15:59 crc kubenswrapper[4733]: E0318 10:15:59.175833 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:15:59 crc kubenswrapper[4733]: E0318 10:15:59.175880 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.176046 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:15:59 crc kubenswrapper[4733]: E0318 10:15:59.176114 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.228597 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/1.log" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.229075 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/0.log" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.229120 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc85b0d4-15a5-4894-9f07-9aaeb28f63fa" containerID="b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a" exitCode=1 Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.229150 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6j2q" event={"ID":"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa","Type":"ContainerDied","Data":"b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a"} Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.229205 4733 scope.go:117] "RemoveContainer" 
containerID="cf9836f3455051ee686f0ec11ceb1c60cff06c95a16bf2fcff6c4c3ed600b034" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.229852 4733 scope.go:117] "RemoveContainer" containerID="b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a" Mar 18 10:15:59 crc kubenswrapper[4733]: E0318 10:15:59.230127 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-g6j2q_openshift-multus(cc85b0d4-15a5-4894-9f07-9aaeb28f63fa)\"" pod="openshift-multus/multus-g6j2q" podUID="cc85b0d4-15a5-4894-9f07-9aaeb28f63fa" Mar 18 10:15:59 crc kubenswrapper[4733]: I0318 10:15:59.250963 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8nsd7" podStartSLOduration=141.250937344 podStartE2EDuration="2m21.250937344s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:15:45.162241315 +0000 UTC m=+184.653975640" watchObservedRunningTime="2026-03-18 10:15:59.250937344 +0000 UTC m=+198.742671709" Mar 18 10:16:00 crc kubenswrapper[4733]: I0318 10:16:00.234515 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/1.log" Mar 18 10:16:01 crc kubenswrapper[4733]: I0318 10:16:01.175184 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:01 crc kubenswrapper[4733]: I0318 10:16:01.175257 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:01 crc kubenswrapper[4733]: I0318 10:16:01.175243 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:01 crc kubenswrapper[4733]: E0318 10:16:01.176099 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:01 crc kubenswrapper[4733]: I0318 10:16:01.176232 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:01 crc kubenswrapper[4733]: E0318 10:16:01.176264 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:01 crc kubenswrapper[4733]: E0318 10:16:01.176383 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:01 crc kubenswrapper[4733]: E0318 10:16:01.176531 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:01 crc kubenswrapper[4733]: E0318 10:16:01.291797 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:16:03 crc kubenswrapper[4733]: I0318 10:16:03.175082 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:03 crc kubenswrapper[4733]: E0318 10:16:03.175332 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:03 crc kubenswrapper[4733]: I0318 10:16:03.175533 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:03 crc kubenswrapper[4733]: I0318 10:16:03.175709 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:03 crc kubenswrapper[4733]: E0318 10:16:03.175874 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:03 crc kubenswrapper[4733]: I0318 10:16:03.175912 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:03 crc kubenswrapper[4733]: E0318 10:16:03.176078 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:03 crc kubenswrapper[4733]: E0318 10:16:03.176299 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:04 crc kubenswrapper[4733]: I0318 10:16:04.175900 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:16:04 crc kubenswrapper[4733]: E0318 10:16:04.176076 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pxwd_openshift-ovn-kubernetes(73327417-4d3b-45f1-b3b6-575fdeeaa31a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" Mar 18 10:16:05 crc kubenswrapper[4733]: I0318 10:16:05.175405 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:05 crc kubenswrapper[4733]: I0318 10:16:05.175483 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:05 crc kubenswrapper[4733]: I0318 10:16:05.175492 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:05 crc kubenswrapper[4733]: E0318 10:16:05.175608 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:05 crc kubenswrapper[4733]: E0318 10:16:05.175757 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:05 crc kubenswrapper[4733]: I0318 10:16:05.175839 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:05 crc kubenswrapper[4733]: E0318 10:16:05.175912 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:05 crc kubenswrapper[4733]: E0318 10:16:05.176065 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:06 crc kubenswrapper[4733]: E0318 10:16:06.293315 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:16:07 crc kubenswrapper[4733]: I0318 10:16:07.175248 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:07 crc kubenswrapper[4733]: I0318 10:16:07.175340 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:07 crc kubenswrapper[4733]: I0318 10:16:07.175406 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:07 crc kubenswrapper[4733]: I0318 10:16:07.175266 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:07 crc kubenswrapper[4733]: E0318 10:16:07.175492 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:07 crc kubenswrapper[4733]: E0318 10:16:07.175585 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:07 crc kubenswrapper[4733]: E0318 10:16:07.175664 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:07 crc kubenswrapper[4733]: E0318 10:16:07.175808 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:09 crc kubenswrapper[4733]: I0318 10:16:09.175522 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:09 crc kubenswrapper[4733]: I0318 10:16:09.175635 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:09 crc kubenswrapper[4733]: I0318 10:16:09.175679 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:09 crc kubenswrapper[4733]: E0318 10:16:09.175713 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:09 crc kubenswrapper[4733]: I0318 10:16:09.175768 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:09 crc kubenswrapper[4733]: E0318 10:16:09.175919 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:09 crc kubenswrapper[4733]: E0318 10:16:09.176030 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:09 crc kubenswrapper[4733]: E0318 10:16:09.176089 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:11 crc kubenswrapper[4733]: I0318 10:16:11.174871 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:11 crc kubenswrapper[4733]: E0318 10:16:11.175866 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:11 crc kubenswrapper[4733]: I0318 10:16:11.175982 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:11 crc kubenswrapper[4733]: I0318 10:16:11.176039 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:11 crc kubenswrapper[4733]: I0318 10:16:11.175976 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:11 crc kubenswrapper[4733]: E0318 10:16:11.176127 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:11 crc kubenswrapper[4733]: E0318 10:16:11.176153 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:11 crc kubenswrapper[4733]: E0318 10:16:11.176232 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:11 crc kubenswrapper[4733]: E0318 10:16:11.293894 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:16:13 crc kubenswrapper[4733]: I0318 10:16:13.175226 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:13 crc kubenswrapper[4733]: I0318 10:16:13.175282 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:13 crc kubenswrapper[4733]: I0318 10:16:13.175261 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:13 crc kubenswrapper[4733]: I0318 10:16:13.175247 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:13 crc kubenswrapper[4733]: E0318 10:16:13.175423 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:13 crc kubenswrapper[4733]: E0318 10:16:13.175748 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:13 crc kubenswrapper[4733]: E0318 10:16:13.175989 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:13 crc kubenswrapper[4733]: E0318 10:16:13.176225 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:14 crc kubenswrapper[4733]: I0318 10:16:14.175306 4733 scope.go:117] "RemoveContainer" containerID="b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a" Mar 18 10:16:15 crc kubenswrapper[4733]: I0318 10:16:15.175349 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:15 crc kubenswrapper[4733]: I0318 10:16:15.175387 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:15 crc kubenswrapper[4733]: E0318 10:16:15.175562 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:15 crc kubenswrapper[4733]: I0318 10:16:15.175597 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:15 crc kubenswrapper[4733]: I0318 10:16:15.175610 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:15 crc kubenswrapper[4733]: E0318 10:16:15.175763 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:15 crc kubenswrapper[4733]: E0318 10:16:15.175873 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:15 crc kubenswrapper[4733]: E0318 10:16:15.175979 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:15 crc kubenswrapper[4733]: I0318 10:16:15.284102 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/1.log" Mar 18 10:16:15 crc kubenswrapper[4733]: I0318 10:16:15.284216 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6j2q" event={"ID":"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa","Type":"ContainerStarted","Data":"e6e4d066d930397d09ab341b832e9b1659ca8d82f0e6fdc83f2d3f3738f5c64d"} Mar 18 10:16:16 crc kubenswrapper[4733]: E0318 10:16:16.295464 4733 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 10:16:17 crc kubenswrapper[4733]: I0318 10:16:17.174753 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:17 crc kubenswrapper[4733]: I0318 10:16:17.174803 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:17 crc kubenswrapper[4733]: E0318 10:16:17.174929 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:17 crc kubenswrapper[4733]: I0318 10:16:17.174961 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:17 crc kubenswrapper[4733]: E0318 10:16:17.175376 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:17 crc kubenswrapper[4733]: E0318 10:16:17.175466 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:17 crc kubenswrapper[4733]: I0318 10:16:17.175671 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:17 crc kubenswrapper[4733]: I0318 10:16:17.176015 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:16:17 crc kubenswrapper[4733]: E0318 10:16:17.176271 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:18 crc kubenswrapper[4733]: I0318 10:16:18.048638 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4s425"] Mar 18 10:16:18 crc kubenswrapper[4733]: I0318 10:16:18.049155 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:18 crc kubenswrapper[4733]: E0318 10:16:18.049335 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:18 crc kubenswrapper[4733]: I0318 10:16:18.300682 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/3.log" Mar 18 10:16:18 crc kubenswrapper[4733]: I0318 10:16:18.303802 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerStarted","Data":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} Mar 18 10:16:18 crc kubenswrapper[4733]: I0318 10:16:18.304266 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:16:18 crc kubenswrapper[4733]: I0318 10:16:18.341110 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podStartSLOduration=160.34109251 podStartE2EDuration="2m40.34109251s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:18.340310945 +0000 UTC m=+217.832045280" watchObservedRunningTime="2026-03-18 10:16:18.34109251 +0000 UTC m=+217.832826845" Mar 18 10:16:19 crc kubenswrapper[4733]: I0318 10:16:19.175248 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:19 crc kubenswrapper[4733]: I0318 10:16:19.175362 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:19 crc kubenswrapper[4733]: E0318 10:16:19.175436 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:19 crc kubenswrapper[4733]: E0318 10:16:19.175498 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:19 crc kubenswrapper[4733]: I0318 10:16:19.175559 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:19 crc kubenswrapper[4733]: E0318 10:16:19.175743 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:20 crc kubenswrapper[4733]: I0318 10:16:20.174650 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:20 crc kubenswrapper[4733]: E0318 10:16:20.174862 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4s425" podUID="b3650177-e338-4eba-ab42-bc0cd14c9d65" Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.174684 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.174685 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.176610 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.176646 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.176831 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.176964 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.275507 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.275828 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:18:23.275776184 +0000 UTC m=+342.767510519 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.275954 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.276014 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.276074 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.276125 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276234 4733 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276315 4733 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276333 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:18:23.276320332 +0000 UTC m=+342.768054667 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276522 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 10:18:23.276459676 +0000 UTC m=+342.768194041 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276709 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276764 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276785 4733 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276907 
4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 10:18:23.276872159 +0000 UTC m=+342.768606624 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.276999 4733 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.277107 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs podName:b3650177-e338-4eba-ab42-bc0cd14c9d65 nodeName:}" failed. No retries permitted until 2026-03-18 10:18:23.277079076 +0000 UTC m=+342.768813441 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs") pod "network-metrics-daemon-4s425" (UID: "b3650177-e338-4eba-ab42-bc0cd14c9d65") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: I0318 10:16:21.377040 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.377372 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.377437 4733 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.377459 4733 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:16:21 crc kubenswrapper[4733]: E0318 10:16:21.377561 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 10:18:23.377533809 +0000 UTC m=+342.869268174 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 10:16:22 crc kubenswrapper[4733]: I0318 10:16:22.367944 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:16:22 crc kubenswrapper[4733]: I0318 10:16:22.368089 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425" Mar 18 10:16:22 crc kubenswrapper[4733]: I0318 10:16:22.368777 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 10:16:22 crc kubenswrapper[4733]: I0318 10:16:22.374017 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 10:16:22 crc kubenswrapper[4733]: I0318 10:16:22.374064 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 10:16:22 crc kubenswrapper[4733]: I0318 10:16:22.374305 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 10:16:22 crc kubenswrapper[4733]: I0318 10:16:22.374456 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 10:16:23 crc kubenswrapper[4733]: I0318 10:16:23.175350 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 10:16:23 crc kubenswrapper[4733]: I0318 10:16:23.178301 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 10:16:23 crc kubenswrapper[4733]: I0318 10:16:23.178762 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.589309 4733 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.645146 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.645903 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.648092 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-848w7"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.648977 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.652040 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nbftd"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.653459 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.653724 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.654858 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.655313 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.655436 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.655722 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.661617 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gxcb2"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.662180 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-gxcb2" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.662653 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.662826 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.662928 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.662677 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.662731 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.663285 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.662802 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.663455 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.662846 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.666236 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s"] Mar 18 10:16:24 
crc kubenswrapper[4733]: I0318 10:16:24.666746 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.666938 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.667173 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.678009 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8v244"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.678469 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.679075 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.680032 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.680922 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.681382 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.683531 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xh9n5"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.683860 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.684395 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zztn5"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.684899 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.686500 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.687305 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.687736 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nwhtg"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.688345 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.691269 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xl5d7"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.691988 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.692413 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.693521 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.693735 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.694110 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.701082 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.702477 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.702763 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.702937 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.703363 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.702830 4733 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.720014 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25331c44-b639-46f7-8a7f-6f62f8779e2b-serving-cert\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.720287 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgmgk\" (UniqueName: \"kubernetes.io/projected/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-kube-api-access-kgmgk\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.720414 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cea3fb-14f9-4993-a8a9-4618680e8286-serving-cert\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.720547 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c02459c-3d75-4363-a010-3e9639bb9b4e-images\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.720653 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rzj8w\" (UniqueName: \"kubernetes.io/projected/25331c44-b639-46f7-8a7f-6f62f8779e2b-kube-api-access-rzj8w\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.720802 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.720914 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-client-ca\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.721011 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c02459c-3d75-4363-a010-3e9639bb9b4e-config\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.721110 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2wcj\" (UniqueName: \"kubernetes.io/projected/0c02459c-3d75-4363-a010-3e9639bb9b4e-kube-api-access-b2wcj\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: 
\"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.721233 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w48br\" (UniqueName: \"kubernetes.io/projected/43cea3fb-14f9-4993-a8a9-4618680e8286-kube-api-access-w48br\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.721399 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/43cea3fb-14f9-4993-a8a9-4618680e8286-available-featuregates\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.721530 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-config\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.721639 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 
10:16:24.721995 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c02459c-3d75-4363-a010-3e9639bb9b4e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.751828 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.754613 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.754977 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.771569 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.772220 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.772301 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.773639 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.774970 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786258 4733 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786373 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786405 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786511 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786523 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786661 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786673 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786785 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786840 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.786785 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.790263 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.790345 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.790542 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.790803 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.790906 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.790945 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791058 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791117 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791226 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791314 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791419 4733 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791489 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791553 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791629 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791641 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791671 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791769 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791804 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791890 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.791941 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.792019 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 
10:16:24.792121 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.792169 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.792274 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.792336 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.812089 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813087 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813444 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813519 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813560 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813616 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813681 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813760 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813843 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813869 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813885 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.813934 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.814484 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.815308 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.815366 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9h9xr"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.815566 4733 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.815788 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.815861 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.816006 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.816162 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.816290 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.816449 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.816625 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.816770 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.816799 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.817762 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.817801 4733 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.817907 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.817927 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.818050 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.818218 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.818349 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.818569 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.818812 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.822835 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.822880 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msfql\" (UniqueName: \"kubernetes.io/projected/61e27ee7-5eb0-4cc7-a696-85ddd192b171-kube-api-access-msfql\") pod \"downloads-7954f5f757-gxcb2\" (UID: \"61e27ee7-5eb0-4cc7-a696-85ddd192b171\") " pod="openshift-console/downloads-7954f5f757-gxcb2" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.822905 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-config\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.822925 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-ca\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 
10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.822943 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2539fca8-3dde-43ed-815c-e837f37dfdd5-auth-proxy-config\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.822966 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b6a9601-6689-435b-aca1-256a0c3c07fb-metrics-tls\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.822987 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2539fca8-3dde-43ed-815c-e837f37dfdd5-config\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823009 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-oauth-serving-cert\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823030 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-metrics-certs\") pod 
\"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823048 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c94c\" (UniqueName: \"kubernetes.io/projected/c0da800f-a7ca-4d0e-89bb-96673854969e-kube-api-access-5c94c\") pod \"migrator-59844c95c7-6572z\" (UID: \"c0da800f-a7ca-4d0e-89bb-96673854969e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823065 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z955v\" (UniqueName: \"kubernetes.io/projected/9c5f567e-b38f-44a0-b1fd-1a96857e811f-kube-api-access-z955v\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823085 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158a5836-f175-4da3-b22d-6a3130a89d30-serving-cert\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823106 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c02459c-3d75-4363-a010-3e9639bb9b4e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823127 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-service-ca\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823144 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-trusted-ca-bundle\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823164 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-stats-auth\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823202 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-service-ca-bundle\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823224 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2cdd\" (UniqueName: \"kubernetes.io/projected/99efba52-bc27-49d7-8efb-154b6e3787a9-kube-api-access-f2cdd\") pod \"cluster-samples-operator-665b6dd947-qs72s\" (UID: 
\"99efba52-bc27-49d7-8efb-154b6e3787a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823258 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2539fca8-3dde-43ed-815c-e837f37dfdd5-machine-approver-tls\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823278 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-config\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.823302 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25331c44-b639-46f7-8a7f-6f62f8779e2b-serving-cert\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824102 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h5xdn"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824232 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwlw\" (UniqueName: \"kubernetes.io/projected/158a5836-f175-4da3-b22d-6a3130a89d30-kube-api-access-ngwlw\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824277 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99efba52-bc27-49d7-8efb-154b6e3787a9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qs72s\" (UID: \"99efba52-bc27-49d7-8efb-154b6e3787a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824296 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b6a9601-6689-435b-aca1-256a0c3c07fb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824315 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9572819-3894-4603-bd2b-7c9465bb0067-serving-cert\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824356 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-oauth-config\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824377 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-serving-cert\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824402 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgmgk\" (UniqueName: \"kubernetes.io/projected/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-kube-api-access-kgmgk\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824425 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cea3fb-14f9-4993-a8a9-4618680e8286-serving-cert\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824447 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c02459c-3d75-4363-a010-3e9639bb9b4e-images\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824471 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" 
Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824492 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-client\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824511 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-default-certificate\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824531 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b6a9601-6689-435b-aca1-256a0c3c07fb-trusted-ca\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824557 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzj8w\" (UniqueName: \"kubernetes.io/projected/25331c44-b639-46f7-8a7f-6f62f8779e2b-kube-api-access-rzj8w\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824570 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824580 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824604 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdltm\" (UniqueName: \"kubernetes.io/projected/f27409fc-b6dd-4573-918b-7b30b3635cc7-kube-api-access-cdltm\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824629 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddl6d\" (UniqueName: \"kubernetes.io/projected/2539fca8-3dde-43ed-815c-e837f37dfdd5-kube-api-access-ddl6d\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824650 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jhd\" (UniqueName: \"kubernetes.io/projected/d9572819-3894-4603-bd2b-7c9465bb0067-kube-api-access-l9jhd\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824672 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5f567e-b38f-44a0-b1fd-1a96857e811f-service-ca-bundle\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824693 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-client-ca\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824712 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-service-ca\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824734 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2wcj\" (UniqueName: \"kubernetes.io/projected/0c02459c-3d75-4363-a010-3e9639bb9b4e-kube-api-access-b2wcj\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824755 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c02459c-3d75-4363-a010-3e9639bb9b4e-config\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc 
kubenswrapper[4733]: I0318 10:16:24.824775 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w48br\" (UniqueName: \"kubernetes.io/projected/43cea3fb-14f9-4993-a8a9-4618680e8286-kube-api-access-w48br\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824795 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-config\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824531 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.825235 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/43cea3fb-14f9-4993-a8a9-4618680e8286-available-featuregates\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.826130 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.831082 4733 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7rr85"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.831611 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c02459c-3d75-4363-a010-3e9639bb9b4e-config\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.832120 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c02459c-3d75-4363-a010-3e9639bb9b4e-images\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.832390 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-client-ca\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.824814 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/43cea3fb-14f9-4993-a8a9-4618680e8286-available-featuregates\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.832701 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-config\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.832749 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s2kq\" (UniqueName: \"kubernetes.io/projected/6b6a9601-6689-435b-aca1-256a0c3c07fb-kube-api-access-6s2kq\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.833703 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-config\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.834473 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.834868 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.835230 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 
10:16:24.835570 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.835593 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vsnq2"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.835705 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.836134 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.836646 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.837049 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.837297 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.837548 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.837798 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.840607 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.842364 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.843660 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25331c44-b639-46f7-8a7f-6f62f8779e2b-serving-cert\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.847072 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.848460 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.848847 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cea3fb-14f9-4993-a8a9-4618680e8286-serving-cert\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.848996 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c02459c-3d75-4363-a010-3e9639bb9b4e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.849030 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.850508 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.851110 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.851673 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.852010 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.853633 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.853663 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.853864 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.853932 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.854073 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.854118 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.854148 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.868724 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.874806 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lptjf"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.875494 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.876917 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h8kqf"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.877078 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.878289 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.878856 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.879373 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.884888 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.886651 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.888367 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.888789 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.889637 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563816-4582s"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.892851 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.895489 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t95b6"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.896513 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-4582s" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.896842 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6hmz"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.900386 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.901178 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.906146 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzj8w\" (UniqueName: \"kubernetes.io/projected/25331c44-b639-46f7-8a7f-6f62f8779e2b-kube-api-access-rzj8w\") pod \"route-controller-manager-6576b87f9c-m5k95\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.911726 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.912403 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.912894 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.912987 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.913777 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.916957 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-848w7"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.920701 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xvnwv"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.921479 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.921587 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.924607 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gxcb2"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.926418 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgmgk\" (UniqueName: \"kubernetes.io/projected/ef9e43d5-8b80-4934-82b6-c8ee0591e1bf-kube-api-access-kgmgk\") pod \"openshift-apiserver-operator-796bbdcf4f-7z2vw\" (UID: \"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.926506 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.929445 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nwhtg"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.932316 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.933995 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddl6d\" (UniqueName: \"kubernetes.io/projected/2539fca8-3dde-43ed-815c-e837f37dfdd5-kube-api-access-ddl6d\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934048 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jhd\" (UniqueName: \"kubernetes.io/projected/d9572819-3894-4603-bd2b-7c9465bb0067-kube-api-access-l9jhd\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934079 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5f567e-b38f-44a0-b1fd-1a96857e811f-service-ca-bundle\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934103 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-service-ca\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934137 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-trusted-ca\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934174 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-config\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934244 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s2kq\" (UniqueName: \"kubernetes.io/projected/6b6a9601-6689-435b-aca1-256a0c3c07fb-kube-api-access-6s2kq\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934281 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9571ba80-f267-46ed-8d16-e44531cb0ce8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934315 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msfql\" (UniqueName: \"kubernetes.io/projected/61e27ee7-5eb0-4cc7-a696-85ddd192b171-kube-api-access-msfql\") pod \"downloads-7954f5f757-gxcb2\" (UID: \"61e27ee7-5eb0-4cc7-a696-85ddd192b171\") " pod="openshift-console/downloads-7954f5f757-gxcb2" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 
10:16:24.934338 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-config\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934362 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48vf\" (UniqueName: \"kubernetes.io/projected/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-kube-api-access-d48vf\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934385 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2539fca8-3dde-43ed-815c-e837f37dfdd5-auth-proxy-config\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934406 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-ca\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934428 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-encryption-config\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") 
" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934451 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4lrt\" (UniqueName: \"kubernetes.io/projected/352d0ed5-c43b-431f-bd66-1749ab30d013-kube-api-access-l4lrt\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934473 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b6a9601-6689-435b-aca1-256a0c3c07fb-metrics-tls\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934500 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2539fca8-3dde-43ed-815c-e837f37dfdd5-config\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934522 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934553 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-serving-cert\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934576 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfbq5\" (UniqueName: \"kubernetes.io/projected/9571ba80-f267-46ed-8d16-e44531cb0ce8-kube-api-access-wfbq5\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934597 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-audit-policies\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934622 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-oauth-serving-cert\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934644 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/352d0ed5-c43b-431f-bd66-1749ab30d013-audit-dir\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934677 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-metrics-certs\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934720 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c94c\" (UniqueName: \"kubernetes.io/projected/c0da800f-a7ca-4d0e-89bb-96673854969e-kube-api-access-5c94c\") pod \"migrator-59844c95c7-6572z\" (UID: \"c0da800f-a7ca-4d0e-89bb-96673854969e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934743 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158a5836-f175-4da3-b22d-6a3130a89d30-serving-cert\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934766 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z955v\" (UniqueName: \"kubernetes.io/projected/9c5f567e-b38f-44a0-b1fd-1a96857e811f-kube-api-access-z955v\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934790 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-service-ca\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: 
I0318 10:16:24.934812 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-trusted-ca-bundle\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934833 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqrn\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-kube-api-access-2qqrn\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934855 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-etcd-client\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934878 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-service-ca-bundle\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934902 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-stats-auth\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " 
pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934924 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2cdd\" (UniqueName: \"kubernetes.io/projected/99efba52-bc27-49d7-8efb-154b6e3787a9-kube-api-access-f2cdd\") pod \"cluster-samples-operator-665b6dd947-qs72s\" (UID: \"99efba52-bc27-49d7-8efb-154b6e3787a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934953 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.934975 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-tls\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935015 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2539fca8-3dde-43ed-815c-e837f37dfdd5-machine-approver-tls\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935037 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935056 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-bound-sa-token\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935090 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-config\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935133 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9571ba80-f267-46ed-8d16-e44531cb0ce8-srv-cert\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935159 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-oauth-config\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935199 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwlw\" (UniqueName: \"kubernetes.io/projected/158a5836-f175-4da3-b22d-6a3130a89d30-kube-api-access-ngwlw\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935222 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99efba52-bc27-49d7-8efb-154b6e3787a9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qs72s\" (UID: \"99efba52-bc27-49d7-8efb-154b6e3787a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935245 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b6a9601-6689-435b-aca1-256a0c3c07fb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935266 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9572819-3894-4603-bd2b-7c9465bb0067-serving-cert\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935288 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-certificates\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: 
\"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935308 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935330 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935354 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-serving-cert\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935393 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-proxy-tls\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935416 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935440 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935476 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-client\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935499 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b6a9601-6689-435b-aca1-256a0c3c07fb-trusted-ca\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935543 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-default-certificate\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.935581 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdltm\" (UniqueName: \"kubernetes.io/projected/f27409fc-b6dd-4573-918b-7b30b3635cc7-kube-api-access-cdltm\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.937381 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-config\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.937501 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-trusted-ca-bundle\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.938033 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-ca\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.938068 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-service-ca\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.938382 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2539fca8-3dde-43ed-815c-e837f37dfdd5-auth-proxy-config\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.938460 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-oauth-serving-cert\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.938550 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5f567e-b38f-44a0-b1fd-1a96857e811f-service-ca-bundle\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.936853 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-service-ca-bundle\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.939054 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-config\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.939374 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.939814 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b6a9601-6689-435b-aca1-256a0c3c07fb-trusted-ca\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.939957 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-service-ca\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.941259 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.941319 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158a5836-f175-4da3-b22d-6a3130a89d30-config\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.941777 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2539fca8-3dde-43ed-815c-e837f37dfdd5-config\") pod 
\"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.942951 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2539fca8-3dde-43ed-815c-e837f37dfdd5-machine-approver-tls\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.942978 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b6a9601-6689-435b-aca1-256a0c3c07fb-metrics-tls\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.943278 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-metrics-certs\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: E0318 10:16:24.943447 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:25.443426925 +0000 UTC m=+224.935161360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.943525 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.944349 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-default-certificate\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.944413 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99efba52-bc27-49d7-8efb-154b6e3787a9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qs72s\" (UID: \"99efba52-bc27-49d7-8efb-154b6e3787a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.944923 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9572819-3894-4603-bd2b-7c9465bb0067-serving-cert\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.944941 4733 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zztn5"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.945040 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/158a5836-f175-4da3-b22d-6a3130a89d30-serving-cert\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.945150 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-serving-cert\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.946591 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nbftd"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.947708 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.947931 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-oauth-config\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.949740 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xh9n5"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.951946 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.953147 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8v244"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.954552 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lptjf"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.955535 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-4582s"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.957311 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9572819-3894-4603-bd2b-7c9465bb0067-etcd-client\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.958071 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.959838 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9c5f567e-b38f-44a0-b1fd-1a96857e811f-stats-auth\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.959956 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p4b5s"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.961178 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2wcj\" (UniqueName: 
\"kubernetes.io/projected/0c02459c-3d75-4363-a010-3e9639bb9b4e-kube-api-access-b2wcj\") pod \"machine-api-operator-5694c8668f-nbftd\" (UID: \"0c02459c-3d75-4363-a010-3e9639bb9b4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.961673 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7rr85"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.961828 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.964138 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hvmrz"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.964852 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.965639 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.967229 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.967432 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.976480 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.978592 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h5xdn"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.979665 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t95b6"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.980595 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w48br\" (UniqueName: \"kubernetes.io/projected/43cea3fb-14f9-4993-a8a9-4618680e8286-kube-api-access-w48br\") pod \"openshift-config-operator-7777fb866f-848w7\" (UID: \"43cea3fb-14f9-4993-a8a9-4618680e8286\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.980784 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9h9xr"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.982093 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.983382 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.984745 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.985840 4733 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.987336 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.988732 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.989137 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"] Mar 18 10:16:24 crc kubenswrapper[4733]: I0318 10:16:24.993231 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.000762 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hvmrz"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.006310 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vsnq2"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.006373 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.008116 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.008151 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.012002 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.014477 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h8kqf"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.017414 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xvnwv"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.017454 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6hmz"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.020602 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p4b5s"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.021503 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pvlch"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.022562 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pvlch" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.022870 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-fnzxw"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.023280 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fnzxw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.024316 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pvlch"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.028430 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036452 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036727 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036773 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036800 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036816 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d915f7d2-5b4d-4017-a839-b615a182fafb-config-volume\") pod \"collect-profiles-29563815-tsrs6\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036833 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036869 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qqrn\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-kube-api-access-2qqrn\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036890 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-dir\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036915 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzlbf\" (UniqueName: \"kubernetes.io/projected/aa4b5542-dc36-4c93-88e5-a080729b94ae-kube-api-access-dzlbf\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036948 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-node-pullsecrets\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.036970 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4lbr5\" (UID: \"f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5" Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.037034 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:25.536996328 +0000 UTC m=+225.028730653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037105 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b405f127-b181-49a1-8205-aafd58d1fa7b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037180 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-tls\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037299 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa4b5542-dc36-4c93-88e5-a080729b94ae-metrics-tls\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037507 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-bound-sa-token\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: 
\"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037584 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4sd\" (UniqueName: \"kubernetes.io/projected/d5979b3d-b4b4-4081-b486-4fcf91f6367c-kube-api-access-px4sd\") pod \"ingress-canary-pvlch\" (UID: \"d5979b3d-b4b4-4081-b486-4fcf91f6367c\") " pod="openshift-ingress-canary/ingress-canary-pvlch" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037623 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84ddb369-1909-4d63-a0c0-b250490992c0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037651 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-signing-cabundle\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037714 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad648fa7-2560-4aa0-8634-05bcbc48916f-config\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037750 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037767 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-registration-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037783 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/87157be2-0fc3-4120-b9b6-d4494ace940a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h8kqf\" (UID: \"87157be2-0fc3-4120-b9b6-d4494ace940a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037796 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-socket-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037820 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7522\" (UniqueName: \"kubernetes.io/projected/ad648fa7-2560-4aa0-8634-05bcbc48916f-kube-api-access-n7522\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037837 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037855 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6nw8\" (UniqueName: \"kubernetes.io/projected/10e64d74-2e25-41fd-a9ad-32a3e74e5c01-kube-api-access-h6nw8\") pod \"dns-operator-744455d44c-vsnq2\" (UID: \"10e64d74-2e25-41fd-a9ad-32a3e74e5c01\") " pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037872 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d915f7d2-5b4d-4017-a839-b615a182fafb-secret-volume\") pod \"collect-profiles-29563815-tsrs6\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037889 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4edea753-21f5-44fd-b183-daf03845dcd8-srv-cert\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037919 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ncjbj\" (UniqueName: \"kubernetes.io/projected/9b0edb65-3bcf-484f-9707-d8124df1ec88-kube-api-access-ncjbj\") pod \"package-server-manager-789f6589d5-kd6gw\" (UID: \"9b0edb65-3bcf-484f-9707-d8124df1ec88\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037935 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-serving-cert\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.037981 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4clx\" (UniqueName: \"kubernetes.io/projected/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-kube-api-access-h4clx\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038001 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-mountpoint-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038017 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038040 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f3d99e-72fa-4c62-8190-059d7a0effd1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038259 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10e64d74-2e25-41fd-a9ad-32a3e74e5c01-metrics-tls\") pod \"dns-operator-744455d44c-vsnq2\" (UID: \"10e64d74-2e25-41fd-a9ad-32a3e74e5c01\") " pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038278 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad648fa7-2560-4aa0-8634-05bcbc48916f-serving-cert\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038293 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4edea753-21f5-44fd-b183-daf03845dcd8-profile-collector-cert\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038308 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038350 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9571ba80-f267-46ed-8d16-e44531cb0ce8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038366 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4810c2fd-346b-44a0-b985-46d302060373-serving-cert\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038408 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0edb65-3bcf-484f-9707-d8124df1ec88-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kd6gw\" (UID: \"9b0edb65-3bcf-484f-9707-d8124df1ec88\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038426 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4810c2fd-346b-44a0-b985-46d302060373-config\") pod 
\"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038445 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038463 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-encryption-config\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038493 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4lrt\" (UniqueName: \"kubernetes.io/projected/352d0ed5-c43b-431f-bd66-1749ab30d013-kube-api-access-l4lrt\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038509 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038524 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-plugins-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038538 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-config\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038565 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mscv\" (UniqueName: \"kubernetes.io/projected/5192f67b-f2ab-45eb-9b1a-64bdff02437a-kube-api-access-8mscv\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038582 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038597 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038614 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-audit-policies\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.038630 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffe185-3f09-44d0-a173-f95bb53c419e-config\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040311 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-audit-policies\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040388 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2kgf\" (UniqueName: \"kubernetes.io/projected/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-kube-api-access-r2kgf\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040422 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppfv2\" (UniqueName: 
\"kubernetes.io/projected/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-kube-api-access-ppfv2\") pod \"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040465 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4810c2fd-346b-44a0-b985-46d302060373-trusted-ca\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040499 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-image-import-ca\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040549 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040568 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f3d99e-72fa-4c62-8190-059d7a0effd1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040631 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-node-bootstrap-token\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040678 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-certs\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040721 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: \"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040862 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngmrr\" (UniqueName: \"kubernetes.io/projected/b405f127-b181-49a1-8205-aafd58d1fa7b-kube-api-access-ngmrr\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040905 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-encryption-config\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040970 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97ffe185-3f09-44d0-a173-f95bb53c419e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.040998 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b405f127-b181-49a1-8205-aafd58d1fa7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041108 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-etcd-client\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041295 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041414 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b405f127-b181-49a1-8205-aafd58d1fa7b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041609 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041646 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-csi-data-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041676 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-images\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f"
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.041697 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:25.541678708 +0000 UTC m=+225.033413163 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041791 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9571ba80-f267-46ed-8d16-e44531cb0ce8-srv-cert\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041814 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-certificates\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.041892 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042021 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042052 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5979b3d-b4b4-4081-b486-4fcf91f6367c-cert\") pod \"ingress-canary-pvlch\" (UID: \"d5979b3d-b4b4-4081-b486-4fcf91f6367c\") " pod="openshift-ingress-canary/ingress-canary-pvlch"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042070 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-signing-key\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042116 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fblf6\" (UniqueName: \"kubernetes.io/projected/4810c2fd-346b-44a0-b985-46d302060373-kube-api-access-fblf6\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042118 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042135 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042586 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042848 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/352d0ed5-c43b-431f-bd66-1749ab30d013-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042888 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-client-ca\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.042915 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.043546 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-client\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.043574 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.043595 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.044069 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-proxy-tls\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.044143 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kwjh\" (UniqueName: \"kubernetes.io/projected/f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7-kube-api-access-6kwjh\") pod \"control-plane-machine-set-operator-78cbb6b69f-4lbr5\" (UID: \"f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.044179 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scsd7\" (UniqueName: \"kubernetes.io/projected/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-kube-api-access-scsd7\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.044257 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grpxc\" (UniqueName: \"kubernetes.io/projected/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-kube-api-access-grpxc\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.044285 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll8g5\" (UniqueName: \"kubernetes.io/projected/34ea1a9f-9093-421f-bef3-228352aa65fb-kube-api-access-ll8g5\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.044338 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa4b5542-dc36-4c93-88e5-a080729b94ae-config-volume\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.044552 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-certificates\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.044684 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046657 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4fz\" (UniqueName: \"kubernetes.io/projected/ed943a82-ef39-4ebc-9d76-09bb69f3b800-kube-api-access-km4fz\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046682 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046684 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-tls\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046728 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4zf8\" (UniqueName: \"kubernetes.io/projected/10f3d99e-72fa-4c62-8190-059d7a0effd1-kube-api-access-t4zf8\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046750 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit-dir\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046800 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffe185-3f09-44d0-a173-f95bb53c419e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046820 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046869 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046907 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046973 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn4vp\" (UniqueName: \"kubernetes.io/projected/3a0400a1-7e6b-4335-8819-586d7a460e3d-kube-api-access-vn4vp\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.046998 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: \"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.047108 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9l2x\" (UniqueName: \"kubernetes.io/projected/87157be2-0fc3-4120-b9b6-d4494ace940a-kube-api-access-j9l2x\") pod \"multus-admission-controller-857f4d67dd-h8kqf\" (UID: \"87157be2-0fc3-4120-b9b6-d4494ace940a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.048729 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.049145 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-encryption-config\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.049294 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9571ba80-f267-46ed-8d16-e44531cb0ce8-srv-cert\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.049443 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.050135 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-trusted-ca\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.052945 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-etcd-client\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053170 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9571ba80-f267-46ed-8d16-e44531cb0ce8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053441 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-proxy-tls\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053496 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea1a9f-9093-421f-bef3-228352aa65fb-serving-cert\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053522 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ddb369-1909-4d63-a0c0-b250490992c0-config\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053549 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-config\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053574 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a0400a1-7e6b-4335-8819-586d7a460e3d-webhook-cert\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053615 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053647 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2whnv\" (UniqueName: \"kubernetes.io/projected/71a70c3c-d483-43f4-9f54-10978c7f8cc8-kube-api-access-2whnv\") pod \"auto-csr-approver-29563816-4582s\" (UID: \"71a70c3c-d483-43f4-9f54-10978c7f8cc8\") " pod="openshift-infra/auto-csr-approver-29563816-4582s"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053673 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a0400a1-7e6b-4335-8819-586d7a460e3d-apiservice-cert\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053695 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d48vf\" (UniqueName: \"kubernetes.io/projected/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-kube-api-access-d48vf\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053716 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw7q6\" (UniqueName: \"kubernetes.io/projected/4edea753-21f5-44fd-b183-daf03845dcd8-kube-api-access-gw7q6\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053748 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a0400a1-7e6b-4335-8819-586d7a460e3d-tmpfs\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053769 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ddb369-1909-4d63-a0c0-b250490992c0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053786 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2qxh\" (UniqueName: \"kubernetes.io/projected/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-kube-api-access-c2qxh\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053806 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.053832 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-serving-cert\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.054173 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: \"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.054233 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lfsh\" (UniqueName: \"kubernetes.io/projected/d915f7d2-5b4d-4017-a839-b615a182fafb-kube-api-access-8lfsh\") pod \"collect-profiles-29563815-tsrs6\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.054321 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfbq5\" (UniqueName: \"kubernetes.io/projected/9571ba80-f267-46ed-8d16-e44531cb0ce8-kube-api-access-wfbq5\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.054425 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-trusted-ca\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.054478 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/352d0ed5-c43b-431f-bd66-1749ab30d013-audit-dir\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.054680 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/352d0ed5-c43b-431f-bd66-1749ab30d013-audit-dir\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.058490 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-proxy-tls\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.062560 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352d0ed5-c43b-431f-bd66-1749ab30d013-serving-cert\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.069250 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.095754 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.108868 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.130556 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.149163 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155103 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155320 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d915f7d2-5b4d-4017-a839-b615a182fafb-config-volume\") pod \"collect-profiles-29563815-tsrs6\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155346 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155364 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-node-pullsecrets\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155380 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-dir\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155398 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzlbf\" (UniqueName: \"kubernetes.io/projected/aa4b5542-dc36-4c93-88e5-a080729b94ae-kube-api-access-dzlbf\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155422 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4lbr5\" (UID: \"f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155439 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b405f127-b181-49a1-8205-aafd58d1fa7b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155464 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa4b5542-dc36-4c93-88e5-a080729b94ae-metrics-tls\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155487 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px4sd\" (UniqueName: \"kubernetes.io/projected/d5979b3d-b4b4-4081-b486-4fcf91f6367c-kube-api-access-px4sd\") pod \"ingress-canary-pvlch\" (UID: \"d5979b3d-b4b4-4081-b486-4fcf91f6367c\") " pod="openshift-ingress-canary/ingress-canary-pvlch"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155504 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84ddb369-1909-4d63-a0c0-b250490992c0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155522 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-signing-cabundle\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155544 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad648fa7-2560-4aa0-8634-05bcbc48916f-config\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155562 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155578 4733 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/87157be2-0fc3-4120-b9b6-d4494ace940a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h8kqf\" (UID: \"87157be2-0fc3-4120-b9b6-d4494ace940a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155592 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-socket-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155607 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-registration-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155624 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7522\" (UniqueName: \"kubernetes.io/projected/ad648fa7-2560-4aa0-8634-05bcbc48916f-kube-api-access-n7522\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155642 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 
10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155659 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6nw8\" (UniqueName: \"kubernetes.io/projected/10e64d74-2e25-41fd-a9ad-32a3e74e5c01-kube-api-access-h6nw8\") pod \"dns-operator-744455d44c-vsnq2\" (UID: \"10e64d74-2e25-41fd-a9ad-32a3e74e5c01\") " pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155673 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d915f7d2-5b4d-4017-a839-b615a182fafb-secret-volume\") pod \"collect-profiles-29563815-tsrs6\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155698 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4edea753-21f5-44fd-b183-daf03845dcd8-srv-cert\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155716 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncjbj\" (UniqueName: \"kubernetes.io/projected/9b0edb65-3bcf-484f-9707-d8124df1ec88-kube-api-access-ncjbj\") pod \"package-server-manager-789f6589d5-kd6gw\" (UID: \"9b0edb65-3bcf-484f-9707-d8124df1ec88\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155732 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-serving-cert\") pod \"apiserver-76f77b778f-xvnwv\" (UID: 
\"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155747 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4clx\" (UniqueName: \"kubernetes.io/projected/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-kube-api-access-h4clx\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155770 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-mountpoint-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155784 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10e64d74-2e25-41fd-a9ad-32a3e74e5c01-metrics-tls\") pod \"dns-operator-744455d44c-vsnq2\" (UID: \"10e64d74-2e25-41fd-a9ad-32a3e74e5c01\") " pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155798 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad648fa7-2560-4aa0-8634-05bcbc48916f-serving-cert\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155815 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155846 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f3d99e-72fa-4c62-8190-059d7a0effd1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155865 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4edea753-21f5-44fd-b183-daf03845dcd8-profile-collector-cert\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155880 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155906 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4810c2fd-346b-44a0-b985-46d302060373-serving-cert\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:25 crc 
kubenswrapper[4733]: I0318 10:16:25.155921 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0edb65-3bcf-484f-9707-d8124df1ec88-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kd6gw\" (UID: \"9b0edb65-3bcf-484f-9707-d8124df1ec88\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155935 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4810c2fd-346b-44a0-b985-46d302060373-config\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155956 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155971 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-plugins-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.155987 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-config\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156002 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffe185-3f09-44d0-a173-f95bb53c419e-config\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156016 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mscv\" (UniqueName: \"kubernetes.io/projected/5192f67b-f2ab-45eb-9b1a-64bdff02437a-kube-api-access-8mscv\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156034 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156050 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156066 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2kgf\" (UniqueName: 
\"kubernetes.io/projected/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-kube-api-access-r2kgf\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156082 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfv2\" (UniqueName: \"kubernetes.io/projected/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-kube-api-access-ppfv2\") pod \"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156096 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156114 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f3d99e-72fa-4c62-8190-059d7a0effd1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156129 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4810c2fd-346b-44a0-b985-46d302060373-trusted-ca\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " 
pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156143 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-image-import-ca\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156162 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97ffe185-3f09-44d0-a173-f95bb53c419e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156179 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-node-bootstrap-token\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156233 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-certs\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156248 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: \"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156265 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngmrr\" (UniqueName: \"kubernetes.io/projected/b405f127-b181-49a1-8205-aafd58d1fa7b-kube-api-access-ngmrr\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156281 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-encryption-config\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156298 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b405f127-b181-49a1-8205-aafd58d1fa7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156322 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b405f127-b181-49a1-8205-aafd58d1fa7b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" Mar 18 10:16:25 crc kubenswrapper[4733]: 
I0318 10:16:25.156338 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-csi-data-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156354 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-images\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156380 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5979b3d-b4b4-4081-b486-4fcf91f6367c-cert\") pod \"ingress-canary-pvlch\" (UID: \"d5979b3d-b4b4-4081-b486-4fcf91f6367c\") " pod="openshift-ingress-canary/ingress-canary-pvlch" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156395 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-signing-key\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.156421 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:25.656406238 +0000 UTC m=+225.148140563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156440 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156461 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fblf6\" (UniqueName: \"kubernetes.io/projected/4810c2fd-346b-44a0-b985-46d302060373-kube-api-access-fblf6\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156478 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-client-ca\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156496 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-client\") pod 
\"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156512 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156546 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156572 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scsd7\" (UniqueName: \"kubernetes.io/projected/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-kube-api-access-scsd7\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156589 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kwjh\" (UniqueName: \"kubernetes.io/projected/f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7-kube-api-access-6kwjh\") pod \"control-plane-machine-set-operator-78cbb6b69f-4lbr5\" (UID: \"f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156610 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-grpxc\" (UniqueName: \"kubernetes.io/projected/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-kube-api-access-grpxc\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156628 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll8g5\" (UniqueName: \"kubernetes.io/projected/34ea1a9f-9093-421f-bef3-228352aa65fb-kube-api-access-ll8g5\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156809 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa4b5542-dc36-4c93-88e5-a080729b94ae-config-volume\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156860 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4fz\" (UniqueName: \"kubernetes.io/projected/ed943a82-ef39-4ebc-9d76-09bb69f3b800-kube-api-access-km4fz\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156891 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:25 crc kubenswrapper[4733]: 
I0318 10:16:25.156910 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4zf8\" (UniqueName: \"kubernetes.io/projected/10f3d99e-72fa-4c62-8190-059d7a0effd1-kube-api-access-t4zf8\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156931 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit-dir\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156951 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffe185-3f09-44d0-a173-f95bb53c419e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156969 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.156986 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: 
\"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157005 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157008 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-node-pullsecrets\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157376 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn4vp\" (UniqueName: \"kubernetes.io/projected/3a0400a1-7e6b-4335-8819-586d7a460e3d-kube-api-access-vn4vp\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157407 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: \"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157428 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9l2x\" 
(UniqueName: \"kubernetes.io/projected/87157be2-0fc3-4120-b9b6-d4494ace940a-kube-api-access-j9l2x\") pod \"multus-admission-controller-857f4d67dd-h8kqf\" (UID: \"87157be2-0fc3-4120-b9b6-d4494ace940a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157465 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-proxy-tls\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157487 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea1a9f-9093-421f-bef3-228352aa65fb-serving-cert\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157508 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ddb369-1909-4d63-a0c0-b250490992c0-config\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157517 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b405f127-b181-49a1-8205-aafd58d1fa7b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" Mar 18 10:16:25 
crc kubenswrapper[4733]: I0318 10:16:25.157525 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-config\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157564 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a0400a1-7e6b-4335-8819-586d7a460e3d-webhook-cert\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157598 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157629 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2whnv\" (UniqueName: \"kubernetes.io/projected/71a70c3c-d483-43f4-9f54-10978c7f8cc8-kube-api-access-2whnv\") pod \"auto-csr-approver-29563816-4582s\" (UID: \"71a70c3c-d483-43f4-9f54-10978c7f8cc8\") " pod="openshift-infra/auto-csr-approver-29563816-4582s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157649 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a0400a1-7e6b-4335-8819-586d7a460e3d-apiservice-cert\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157675 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw7q6\" (UniqueName: \"kubernetes.io/projected/4edea753-21f5-44fd-b183-daf03845dcd8-kube-api-access-gw7q6\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157702 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a0400a1-7e6b-4335-8819-586d7a460e3d-tmpfs\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157722 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: \"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157738 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ddb369-1909-4d63-a0c0-b250490992c0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157756 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2qxh\" (UniqueName: 
\"kubernetes.io/projected/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-kube-api-access-c2qxh\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157775 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157805 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lfsh\" (UniqueName: \"kubernetes.io/projected/d915f7d2-5b4d-4017-a839-b615a182fafb-kube-api-access-8lfsh\") pod \"collect-profiles-29563815-tsrs6\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157832 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157850 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.157968 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-registration-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.158351 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-csi-data-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.158327 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-mountpoint-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.158488 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-plugins-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.158530 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit-dir\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:25 crc 
kubenswrapper[4733]: I0318 10:16:25.159479 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-dir\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.159969 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-signing-key\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.159984 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.160163 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.160563 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed943a82-ef39-4ebc-9d76-09bb69f3b800-socket-dir\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" Mar 18 
10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.161911 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-signing-cabundle\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.162586 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.162795 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.165385 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4edea753-21f5-44fd-b183-daf03845dcd8-profile-collector-cert\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.166301 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: 
\"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.166495 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-images\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.167947 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.169286 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-proxy-tls\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.169379 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-client-ca\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.169391 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a0400a1-7e6b-4335-8819-586d7a460e3d-tmpfs\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" Mar 18 10:16:25 
crc kubenswrapper[4733]: I0318 10:16:25.169654 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-config\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.170134 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4edea753-21f5-44fd-b183-daf03845dcd8-srv-cert\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.170577 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: \"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.170679 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4lbr5\" (UID: \"f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.172039 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d915f7d2-5b4d-4017-a839-b615a182fafb-secret-volume\") pod \"collect-profiles-29563815-tsrs6\" (UID: 
\"d915f7d2-5b4d-4017-a839-b615a182fafb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.172232 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.188335 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.193685 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea1a9f-9093-421f-bef3-228352aa65fb-serving-cert\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.208569 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.228593 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.229010 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.248381 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.264766 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.265131 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:25.765117175 +0000 UTC m=+225.256851500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.268050 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.277748 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.278948 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.288502 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.295608 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.307785 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.328827 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.341775 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d915f7d2-5b4d-4017-a839-b615a182fafb-config-volume\") pod \"collect-profiles-29563815-tsrs6\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.348303 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.366357 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.367340 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:25.867322115 +0000 UTC m=+225.359056440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.368887 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.384658 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.390073 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.390506 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nbftd"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.401149 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10e64d74-2e25-41fd-a9ad-32a3e74e5c01-metrics-tls\") pod \"dns-operator-744455d44c-vsnq2\" (UID: \"10e64d74-2e25-41fd-a9ad-32a3e74e5c01\") " pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" Mar 18 10:16:25 crc kubenswrapper[4733]: W0318 10:16:25.401807 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25331c44_b639_46f7_8a7f_6f62f8779e2b.slice/crio-0a4e2b2140bacea055efd9eb333f7f7f1da7235e623090af40eaf58bc070ecb2 WatchSource:0}: Error finding container 0a4e2b2140bacea055efd9eb333f7f7f1da7235e623090af40eaf58bc070ecb2: Status 
404 returned error can't find the container with id 0a4e2b2140bacea055efd9eb333f7f7f1da7235e623090af40eaf58bc070ecb2 Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.402179 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" event={"ID":"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf","Type":"ContainerStarted","Data":"7d91e90576d6f69a752503d1db70731b645844079844bd8d008241f1b17a35ca"} Mar 18 10:16:25 crc kubenswrapper[4733]: W0318 10:16:25.404911 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c02459c_3d75_4363_a010_3e9639bb9b4e.slice/crio-1c6c59013156a64bebe9e58fcc19d04eef4644b14406fc14905ab52fb61b3f46 WatchSource:0}: Error finding container 1c6c59013156a64bebe9e58fcc19d04eef4644b14406fc14905ab52fb61b3f46: Status 404 returned error can't find the container with id 1c6c59013156a64bebe9e58fcc19d04eef4644b14406fc14905ab52fb61b3f46 Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.409125 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.431556 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.448979 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.452828 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b405f127-b181-49a1-8205-aafd58d1fa7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.468366 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-848w7"] Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.468578 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.469107 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.469724 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:25.96970877 +0000 UTC m=+225.461443095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.472450 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a0400a1-7e6b-4335-8819-586d7a460e3d-webhook-cert\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.474277 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a0400a1-7e6b-4335-8819-586d7a460e3d-apiservice-cert\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:25 crc kubenswrapper[4733]: W0318 10:16:25.475589 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43cea3fb_14f9_4993_a8a9_4618680e8286.slice/crio-266a4c82e657da88a4fe5fb06d7ebc997772f1e97435dac09b83d427850e9c87 WatchSource:0}: Error finding container 266a4c82e657da88a4fe5fb06d7ebc997772f1e97435dac09b83d427850e9c87: Status 404 returned error can't find the container with id 266a4c82e657da88a4fe5fb06d7ebc997772f1e97435dac09b83d427850e9c87
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.488514 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.509241 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.528543 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.540158 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f3d99e-72fa-4c62-8190-059d7a0effd1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.548125 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.567784 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.570699 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.571026 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.0710055 +0000 UTC m=+225.562739825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.571609 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.571974 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.07195774 +0000 UTC m=+225.563692155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.573446 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f3d99e-72fa-4c62-8190-059d7a0effd1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.608399 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.636275 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.639322 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4810c2fd-346b-44a0-b985-46d302060373-trusted-ca\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.648463 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.662015 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4810c2fd-346b-44a0-b985-46d302060373-config\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.669503 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.672362 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.673296 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.173282512 +0000 UTC m=+225.665016837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.688595 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.707910 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.724136 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4810c2fd-346b-44a0-b985-46d302060373-serving-cert\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.728983 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.732706 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/87157be2-0fc3-4120-b9b6-d4494ace940a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h8kqf\" (UID: \"87157be2-0fc3-4120-b9b6-d4494ace940a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.748689 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.770289 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.775020 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.775454 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.275435649 +0000 UTC m=+225.767170054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.782960 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ffe185-3f09-44d0-a173-f95bb53c419e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.788724 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.808261 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.829021 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.831370 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ffe185-3f09-44d0-a173-f95bb53c419e-config\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.849359 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.869471 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.876773 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.877103 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.37706724 +0000 UTC m=+225.868801595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.877515 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.878100 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.378086793 +0000 UTC m=+225.869821118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.888501 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.906497 4733 request.go:700] Waited for 1.005027989s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dservice-ca-operator-dockercfg-rg9jl&limit=500&resourceVersion=0
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.908908 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.928816 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.941781 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad648fa7-2560-4aa0-8634-05bcbc48916f-serving-cert\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.948864 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.950207 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad648fa7-2560-4aa0-8634-05bcbc48916f-config\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.969477 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.979314 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.979470 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.479438165 +0000 UTC m=+225.971172500 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.980564 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:25 crc kubenswrapper[4733]: E0318 10:16:25.981033 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.481021426 +0000 UTC m=+225.972755761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:25 crc kubenswrapper[4733]: I0318 10:16:25.991609 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.016960 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.028613 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.028997 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.041444 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.050274 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.061472 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.068756 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.080861 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.083774 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.084002 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.583970338 +0000 UTC m=+226.075704683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.084677 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.085167 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.585147986 +0000 UTC m=+226.076882331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.089016 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.093124 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.110220 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.123394 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.143267 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.149157 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.153934 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158312 4733 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158343 4733 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158381 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca podName:486eda8c-6e6f-4761-b28c-8aeb72fcfcc1 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658363548 +0000 UTC m=+226.150097873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-n6hmz" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158425 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b0edb65-3bcf-484f-9707-d8124df1ec88-package-server-manager-serving-cert podName:9b0edb65-3bcf-484f-9707-d8124df1ec88 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658404099 +0000 UTC m=+226.150138434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9b0edb65-3bcf-484f-9707-d8124df1ec88-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-kd6gw" (UID: "9b0edb65-3bcf-484f-9707-d8124df1ec88") : failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158424 4733 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158450 4733 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158470 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-config podName:56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658459191 +0000 UTC m=+226.150193526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-config") pod "apiserver-76f77b778f-xvnwv" (UID: "56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158487 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa4b5542-dc36-4c93-88e5-a080729b94ae-config-volume podName:aa4b5542-dc36-4c93-88e5-a080729b94ae nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658477891 +0000 UTC m=+226.150212216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/aa4b5542-dc36-4c93-88e5-a080729b94ae-config-volume") pod "dns-default-hvmrz" (UID: "aa4b5542-dc36-4c93-88e5-a080729b94ae") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158488 4733 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158501 4733 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158516 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-node-bootstrap-token podName:a39a28f7-1fd2-44f7-8b49-05a0faf1e000 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658510922 +0000 UTC m=+226.150245247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-node-bootstrap-token") pod "machine-config-server-fnzxw" (UID: "a39a28f7-1fd2-44f7-8b49-05a0faf1e000") : failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158318 4733 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158530 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-certs podName:a39a28f7-1fd2-44f7-8b49-05a0faf1e000 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658522113 +0000 UTC m=+226.150256458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-certs") pod "machine-config-server-fnzxw" (UID: "a39a28f7-1fd2-44f7-8b49-05a0faf1e000") : failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158549 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-trusted-ca-bundle podName:56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658541093 +0000 UTC m=+226.150275438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-trusted-ca-bundle") pod "apiserver-76f77b778f-xvnwv" (UID: "56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158595 4733 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158624 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-encryption-config podName:56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658617016 +0000 UTC m=+226.150351351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-encryption-config") pod "apiserver-76f77b778f-xvnwv" (UID: "56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6") : failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158644 4733 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158671 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5979b3d-b4b4-4081-b486-4fcf91f6367c-cert podName:d5979b3d-b4b4-4081-b486-4fcf91f6367c nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658662387 +0000 UTC m=+226.150396722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5979b3d-b4b4-4081-b486-4fcf91f6367c-cert") pod "ingress-canary-pvlch" (UID: "d5979b3d-b4b4-4081-b486-4fcf91f6367c") : failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158714 4733 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158741 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies podName:486eda8c-6e6f-4761-b28c-8aeb72fcfcc1 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658733919 +0000 UTC m=+226.150468254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies") pod "oauth-openshift-558db77b4-n6hmz" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1") : failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158760 4733 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158844 4733 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158874 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-client podName:56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658866774 +0000 UTC m=+226.150601099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-client") pod "apiserver-76f77b778f-xvnwv" (UID: "56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6") : failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158919 4733 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158943 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ddb369-1909-4d63-a0c0-b250490992c0-serving-cert podName:84ddb369-1909-4d63-a0c0-b250490992c0 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.658937066 +0000 UTC m=+226.150671391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/84ddb369-1909-4d63-a0c0-b250490992c0-serving-cert") pod "kube-controller-manager-operator-78b949d7b-bzhq6" (UID: "84ddb369-1909-4d63-a0c0-b250490992c0") : failed to sync secret cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.158976 4733 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition
Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.159219 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit podName:56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.659179714 +0000 UTC m=+226.150914029 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit") pod "apiserver-76f77b778f-xvnwv" (UID: "56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6") : failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.159242 4733 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.159265 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84ddb369-1909-4d63-a0c0-b250490992c0-config podName:84ddb369-1909-4d63-a0c0-b250490992c0 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.659258576 +0000 UTC m=+226.150992901 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/84ddb369-1909-4d63-a0c0-b250490992c0-config") pod "kube-controller-manager-operator-78b949d7b-bzhq6" (UID: "84ddb369-1909-4d63-a0c0-b250490992c0") : failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160397 4733 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160404 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-serving-cert podName:56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.660367842 +0000 UTC m=+226.152102177 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-serving-cert") pod "apiserver-76f77b778f-xvnwv" (UID: "56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6") : failed to sync secret cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160483 4733 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160541 4733 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160498 4733 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160500 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session podName:486eda8c-6e6f-4761-b28c-8aeb72fcfcc1 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.660481025 +0000 UTC m=+226.152215440 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session") pod "oauth-openshift-558db77b4-n6hmz" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1") : failed to sync secret cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160614 4733 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160641 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle podName:486eda8c-6e6f-4761-b28c-8aeb72fcfcc1 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.66061327 +0000 UTC m=+226.152347595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-n6hmz" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1") : failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160516 4733 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160657 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-image-import-ca podName:56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.660652001 +0000 UTC m=+226.152386326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-image-import-ca") pod "apiserver-76f77b778f-xvnwv" (UID: "56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6") : failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160706 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig podName:486eda8c-6e6f-4761-b28c-8aeb72fcfcc1 nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.660693112 +0000 UTC m=+226.152427547 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-n6hmz" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1") : failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160724 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa4b5542-dc36-4c93-88e5-a080729b94ae-metrics-tls podName:aa4b5542-dc36-4c93-88e5-a080729b94ae nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.660713853 +0000 UTC m=+226.152448278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa4b5542-dc36-4c93-88e5-a080729b94ae-metrics-tls") pod "dns-default-hvmrz" (UID: "aa4b5542-dc36-4c93-88e5-a080729b94ae") : failed to sync secret cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.160757 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-serving-ca podName:56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6 nodeName:}" failed. 
No retries permitted until 2026-03-18 10:16:26.660748944 +0000 UTC m=+226.152483379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-serving-ca") pod "apiserver-76f77b778f-xvnwv" (UID: "56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6") : failed to sync configmap cache: timed out waiting for the condition Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.168875 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.185223 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.185849 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.685831346 +0000 UTC m=+226.177565671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.186024 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.186374 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.686363433 +0000 UTC m=+226.178097758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.188777 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.207602 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.229235 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.256638 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.269436 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.287472 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.287692 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.787666874 +0000 UTC m=+226.279401199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.288097 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.289326 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.789279675 +0000 UTC m=+226.281014190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.289690 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.308990 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.329162 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.349046 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.368551 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.389145 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.389499 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.389713 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.889683797 +0000 UTC m=+226.381418122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.390673 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.390974 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.890964438 +0000 UTC m=+226.382698753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.408407 4733 generic.go:334] "Generic (PLEG): container finished" podID="43cea3fb-14f9-4993-a8a9-4618680e8286" containerID="01b8744b846f9fb6ffe485cf34336b1d737fb2ba6bebf606b4c7d34295512d87" exitCode=0 Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.408488 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" event={"ID":"43cea3fb-14f9-4993-a8a9-4618680e8286","Type":"ContainerDied","Data":"01b8744b846f9fb6ffe485cf34336b1d737fb2ba6bebf606b4c7d34295512d87"} Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.408524 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" event={"ID":"43cea3fb-14f9-4993-a8a9-4618680e8286","Type":"ContainerStarted","Data":"266a4c82e657da88a4fe5fb06d7ebc997772f1e97435dac09b83d427850e9c87"} Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.408691 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.410690 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" event={"ID":"0c02459c-3d75-4363-a010-3e9639bb9b4e","Type":"ContainerStarted","Data":"f90ad87e25013bf2f6f8581f6f43f0b35b6c4a22d87ae5d7ec104a7eda47afaa"} Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.410734 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" event={"ID":"0c02459c-3d75-4363-a010-3e9639bb9b4e","Type":"ContainerStarted","Data":"6418ec699be9d8b004cc9e2a35a84ccc8b26c389cdb4bbec509b2660f63b7fd4"} Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.410780 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" event={"ID":"0c02459c-3d75-4363-a010-3e9639bb9b4e","Type":"ContainerStarted","Data":"1c6c59013156a64bebe9e58fcc19d04eef4644b14406fc14905ab52fb61b3f46"} Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.412749 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" event={"ID":"25331c44-b639-46f7-8a7f-6f62f8779e2b","Type":"ContainerStarted","Data":"3c09df7a275938153d455f147ffe12eff185edea72f1d8646898b9ba5ba684d5"} Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.412782 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" event={"ID":"25331c44-b639-46f7-8a7f-6f62f8779e2b","Type":"ContainerStarted","Data":"0a4e2b2140bacea055efd9eb333f7f7f1da7235e623090af40eaf58bc070ecb2"} Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.412968 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.414461 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" event={"ID":"ef9e43d5-8b80-4934-82b6-c8ee0591e1bf","Type":"ContainerStarted","Data":"ee668f2791d6ce76a1d528f6019a92f746e2d687878c1f727c7cad536298f5e3"} Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.428724 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.449376 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.469134 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.489453 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.492080 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.495070 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:26.995034607 +0000 UTC m=+226.486768932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.509698 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.530621 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.548251 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.574716 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.588339 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.595134 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.595491 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.09548036 +0000 UTC m=+226.587214685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.622040 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdltm\" (UniqueName: \"kubernetes.io/projected/f27409fc-b6dd-4573-918b-7b30b3635cc7-kube-api-access-cdltm\") pod \"console-f9d7485db-8v244\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") " pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.646767 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddl6d\" (UniqueName: \"kubernetes.io/projected/2539fca8-3dde-43ed-815c-e837f37dfdd5-kube-api-access-ddl6d\") pod \"machine-approver-56656f9798-9dd56\" (UID: \"2539fca8-3dde-43ed-815c-e837f37dfdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.662809 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msfql\" (UniqueName: \"kubernetes.io/projected/61e27ee7-5eb0-4cc7-a696-85ddd192b171-kube-api-access-msfql\") pod \"downloads-7954f5f757-gxcb2\" (UID: \"61e27ee7-5eb0-4cc7-a696-85ddd192b171\") " pod="openshift-console/downloads-7954f5f757-gxcb2" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.689816 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jhd\" (UniqueName: 
\"kubernetes.io/projected/d9572819-3894-4603-bd2b-7c9465bb0067-kube-api-access-l9jhd\") pod \"etcd-operator-b45778765-zztn5\" (UID: \"d9572819-3894-4603-bd2b-7c9465bb0067\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.691623 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.698058 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.698817 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.698991 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-serving-cert\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.699122 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0edb65-3bcf-484f-9707-d8124df1ec88-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kd6gw\" (UID: 
\"9b0edb65-3bcf-484f-9707-d8124df1ec88\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.699426 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.699556 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.699675 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-image-import-ca\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.699787 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-node-bootstrap-token\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.699883 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-certs\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.700021 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-encryption-config\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.700162 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5979b3d-b4b4-4081-b486-4fcf91f6367c-cert\") pod \"ingress-canary-pvlch\" (UID: \"d5979b3d-b4b4-4081-b486-4fcf91f6367c\") " pod="openshift-ingress-canary/ingress-canary-pvlch" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.700340 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-client\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.700493 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.700684 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/aa4b5542-dc36-4c93-88e5-a080729b94ae-config-volume\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.700903 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.701015 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.701142 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ddb369-1909-4d63-a0c0-b250490992c0-config\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.701285 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-config\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.701445 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/84ddb369-1909-4d63-a0c0-b250490992c0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.701573 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.701712 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.701851 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa4b5542-dc36-4c93-88e5-a080729b94ae-metrics-tls\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.702235 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.202215964 +0000 UTC m=+226.693950299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.703732 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.703818 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.704009 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ddb369-1909-4d63-a0c0-b250490992c0-config\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.704840 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-audit\") pod \"apiserver-76f77b778f-xvnwv\" (UID: 
\"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.705808 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-image-import-ca\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.705992 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.706256 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-config\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.707683 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.708584 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-encryption-config\") pod \"apiserver-76f77b778f-xvnwv\" (UID: 
\"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.708838 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.710055 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-etcd-client\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.711880 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.712025 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-serving-cert\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.713802 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ddb369-1909-4d63-a0c0-b250490992c0-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.718900 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.722342 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0edb65-3bcf-484f-9707-d8124df1ec88-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kd6gw\" (UID: \"9b0edb65-3bcf-484f-9707-d8124df1ec88\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.722553 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngwlw\" (UniqueName: \"kubernetes.io/projected/158a5836-f175-4da3-b22d-6a3130a89d30-kube-api-access-ngwlw\") pod \"authentication-operator-69f744f599-xh9n5\" (UID: \"158a5836-f175-4da3-b22d-6a3130a89d30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.727475 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s2kq\" (UniqueName: \"kubernetes.io/projected/6b6a9601-6689-435b-aca1-256a0c3c07fb-kube-api-access-6s2kq\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.749147 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c94c\" (UniqueName: \"kubernetes.io/projected/c0da800f-a7ca-4d0e-89bb-96673854969e-kube-api-access-5c94c\") pod \"migrator-59844c95c7-6572z\" (UID: \"c0da800f-a7ca-4d0e-89bb-96673854969e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.764111 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2cdd\" (UniqueName: \"kubernetes.io/projected/99efba52-bc27-49d7-8efb-154b6e3787a9-kube-api-access-f2cdd\") pod \"cluster-samples-operator-665b6dd947-qs72s\" (UID: \"99efba52-bc27-49d7-8efb-154b6e3787a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.785221 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b6a9601-6689-435b-aca1-256a0c3c07fb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ltwbb\" (UID: \"6b6a9601-6689-435b-aca1-256a0c3c07fb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.803835 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.804666 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 10:16:27.304649401 +0000 UTC m=+226.796383726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.811836 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.818956 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z955v\" (UniqueName: \"kubernetes.io/projected/9c5f567e-b38f-44a0-b1fd-1a96857e811f-kube-api-access-z955v\") pod \"router-default-5444994796-xl5d7\" (UID: \"9c5f567e-b38f-44a0-b1fd-1a96857e811f\") " pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.819293 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gxcb2" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.838050 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xl5d7" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.838701 4733 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.842140 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.847841 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.868660 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.872303 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.890547 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.896986 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa4b5542-dc36-4c93-88e5-a080729b94ae-config-volume\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.905089 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8v244" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.905381 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.905530 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.405506137 +0000 UTC m=+226.897240462 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.905918 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:26 crc kubenswrapper[4733]: E0318 10:16:26.906258 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.406250841 +0000 UTC m=+226.897985166 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.908834 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.916432 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa4b5542-dc36-4c93-88e5-a080729b94ae-metrics-tls\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.924420 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.926297 4733 request.go:700] Waited for 1.903494839s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.928065 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.935649 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:26 crc kubenswrapper[4733]: W0318 10:16:26.936526 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2539fca8_3dde_43ed_815c_e837f37dfdd5.slice/crio-7641cfeea867267cd87fc0fa0061dc622336101049c7d0d10a3af6628d0c4a6d WatchSource:0}: Error finding container 7641cfeea867267cd87fc0fa0061dc622336101049c7d0d10a3af6628d0c4a6d: Status 404 returned error can't find the container with id 7641cfeea867267cd87fc0fa0061dc622336101049c7d0d10a3af6628d0c4a6d Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.949537 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.970258 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zztn5"] Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.971516 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.978742 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5979b3d-b4b4-4081-b486-4fcf91f6367c-cert\") pod \"ingress-canary-pvlch\" (UID: \"d5979b3d-b4b4-4081-b486-4fcf91f6367c\") " pod="openshift-ingress-canary/ingress-canary-pvlch" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.978860 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" Mar 18 10:16:26 crc kubenswrapper[4733]: I0318 10:16:26.988742 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.007281 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.007448 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.507422077 +0000 UTC m=+226.999156402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.007782 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.008067 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.508054408 +0000 UTC m=+226.999788733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.013806 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.023625 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-certs\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw" Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.028833 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.032689 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.051220 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.063287 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-node-bootstrap-token\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.092506 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qqrn\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-kube-api-access-2qqrn\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.111407 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.111965 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.611945101 +0000 UTC m=+227.103679426 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.124820 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-bound-sa-token\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.131701 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4lrt\" (UniqueName: \"kubernetes.io/projected/352d0ed5-c43b-431f-bd66-1749ab30d013-kube-api-access-l4lrt\") pod \"apiserver-7bbb656c7d-lsqn4\" (UID: \"352d0ed5-c43b-431f-bd66-1749ab30d013\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.148325 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48vf\" (UniqueName: \"kubernetes.io/projected/57151941-19ac-4bb5-a93b-b5dfbc88e0d6-kube-api-access-d48vf\") pod \"machine-config-controller-84d6567774-hw7zb\" (UID: \"57151941-19ac-4bb5-a93b-b5dfbc88e0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.165505 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfbq5\" (UniqueName: \"kubernetes.io/projected/9571ba80-f267-46ed-8d16-e44531cb0ce8-kube-api-access-wfbq5\") pod \"olm-operator-6b444d44fb-2wc5m\" (UID: \"9571ba80-f267-46ed-8d16-e44531cb0ce8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.187756 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4clx\" (UniqueName: \"kubernetes.io/projected/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-kube-api-access-h4clx\") pod \"oauth-openshift-558db77b4-n6hmz\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.197909 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.204824 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4sd\" (UniqueName: \"kubernetes.io/projected/d5979b3d-b4b4-4081-b486-4fcf91f6367c-kube-api-access-px4sd\") pod \"ingress-canary-pvlch\" (UID: \"d5979b3d-b4b4-4081-b486-4fcf91f6367c\") " pod="openshift-ingress-canary/ingress-canary-pvlch"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.213135 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.215428 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.7154095 +0000 UTC m=+227.207143825 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.225444 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8v244"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.230794 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.234165 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mscv\" (UniqueName: \"kubernetes.io/projected/5192f67b-f2ab-45eb-9b1a-64bdff02437a-kube-api-access-8mscv\") pod \"marketplace-operator-79b997595-9h9xr\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.238928 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.246893 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2kgf\" (UniqueName: \"kubernetes.io/projected/a39a28f7-1fd2-44f7-8b49-05a0faf1e000-kube-api-access-r2kgf\") pod \"machine-config-server-fnzxw\" (UID: \"a39a28f7-1fd2-44f7-8b49-05a0faf1e000\") " pod="openshift-machine-config-operator/machine-config-server-fnzxw"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.250976 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.252664 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gxcb2"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.268615 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppfv2\" (UniqueName: \"kubernetes.io/projected/fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf-kube-api-access-ppfv2\") pod \"openshift-controller-manager-operator-756b6f6bc6-mj46t\" (UID: \"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.287839 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xh9n5"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.290989 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pvlch"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.295855 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fnzxw"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.298044 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b405f127-b181-49a1-8205-aafd58d1fa7b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.301550 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.305158 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.305820 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4fz\" (UniqueName: \"kubernetes.io/projected/ed943a82-ef39-4ebc-9d76-09bb69f3b800-kube-api-access-km4fz\") pod \"csi-hostpathplugin-p4b5s\" (UID: \"ed943a82-ef39-4ebc-9d76-09bb69f3b800\") " pod="hostpath-provisioner/csi-hostpathplugin-p4b5s"
Mar 18 10:16:27 crc kubenswrapper[4733]: W0318 10:16:27.312759 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158a5836_f175_4da3_b22d_6a3130a89d30.slice/crio-a8d759096e3def15bd7fc6782fe1ae3d0b85fb4bb74cad2d5fdba68e3116790f WatchSource:0}: Error finding container a8d759096e3def15bd7fc6782fe1ae3d0b85fb4bb74cad2d5fdba68e3116790f: Status 404 returned error can't find the container with id a8d759096e3def15bd7fc6782fe1ae3d0b85fb4bb74cad2d5fdba68e3116790f
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.314293 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.314488 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.814463249 +0000 UTC m=+227.306197574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.314804 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.315247 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.815238174 +0000 UTC m=+227.306972499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: W0318 10:16:27.323013 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0da800f_a7ca_4d0e_89bb_96673854969e.slice/crio-8c23f5367b08d7ec924e81e14494da32936d3eded7ada5c1129339081e4668e4 WatchSource:0}: Error finding container 8c23f5367b08d7ec924e81e14494da32936d3eded7ada5c1129339081e4668e4: Status 404 returned error can't find the container with id 8c23f5367b08d7ec924e81e14494da32936d3eded7ada5c1129339081e4668e4
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.324531 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngmrr\" (UniqueName: \"kubernetes.io/projected/b405f127-b181-49a1-8205-aafd58d1fa7b-kube-api-access-ngmrr\") pod \"cluster-image-registry-operator-dc59b4c8b-gdmqx\" (UID: \"b405f127-b181-49a1-8205-aafd58d1fa7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.346422 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.350264 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84ddb369-1909-4d63-a0c0-b250490992c0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bzhq6\" (UID: \"84ddb369-1909-4d63-a0c0-b250490992c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.369570 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.370672 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2qxh\" (UniqueName: \"kubernetes.io/projected/56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6-kube-api-access-c2qxh\") pod \"apiserver-76f77b778f-xvnwv\" (UID: \"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6\") " pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:27 crc kubenswrapper[4733]: W0318 10:16:27.400135 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39a28f7_1fd2_44f7_8b49_05a0faf1e000.slice/crio-b8f6e716a5c1af59b334b099ae81e6b4488d512e7cedc4016a21f3ebde2403f2 WatchSource:0}: Error finding container b8f6e716a5c1af59b334b099ae81e6b4488d512e7cedc4016a21f3ebde2403f2: Status 404 returned error can't find the container with id b8f6e716a5c1af59b334b099ae81e6b4488d512e7cedc4016a21f3ebde2403f2
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.402446 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6nw8\" (UniqueName: \"kubernetes.io/projected/10e64d74-2e25-41fd-a9ad-32a3e74e5c01-kube-api-access-h6nw8\") pod \"dns-operator-744455d44c-vsnq2\" (UID: \"10e64d74-2e25-41fd-a9ad-32a3e74e5c01\") " pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.404579 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fblf6\" (UniqueName: \"kubernetes.io/projected/4810c2fd-346b-44a0-b985-46d302060373-kube-api-access-fblf6\") pod \"console-operator-58897d9998-lptjf\" (UID: \"4810c2fd-346b-44a0-b985-46d302060373\") " pod="openshift-console-operator/console-operator-58897d9998-lptjf"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.408452 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.415507 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.415948 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:27.915932915 +0000 UTC m=+227.407667230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.420838 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.425143 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" event={"ID":"158a5836-f175-4da3-b22d-6a3130a89d30","Type":"ContainerStarted","Data":"a8d759096e3def15bd7fc6782fe1ae3d0b85fb4bb74cad2d5fdba68e3116790f"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.427634 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.428518 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xl5d7" event={"ID":"9c5f567e-b38f-44a0-b1fd-1a96857e811f","Type":"ContainerStarted","Data":"6a0821eaf0ace459007c5e1ac4ea5e8e71a9bf775a8c12d7396c4a66aae6f399"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.428556 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xl5d7" event={"ID":"9c5f567e-b38f-44a0-b1fd-1a96857e811f","Type":"ContainerStarted","Data":"77a44e88ff52ee3e93fedaca9ef3e99488e511df1a94502edd941b321649b22f"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.434585 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2whnv\" (UniqueName: \"kubernetes.io/projected/71a70c3c-d483-43f4-9f54-10978c7f8cc8-kube-api-access-2whnv\") pod \"auto-csr-approver-29563816-4582s\" (UID: \"71a70c3c-d483-43f4-9f54-10978c7f8cc8\") " pod="openshift-infra/auto-csr-approver-29563816-4582s"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.436005 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gxcb2" event={"ID":"61e27ee7-5eb0-4cc7-a696-85ddd192b171","Type":"ContainerStarted","Data":"c6a2e1b73a77cc11d754771ab069f0ca8832726e42d557307e0ab4ffb83a3bf0"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.445597 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8v244" event={"ID":"f27409fc-b6dd-4573-918b-7b30b3635cc7","Type":"ContainerStarted","Data":"a5e5da5d6249a1112447a42843768f7217f63fd427eb58063240eac26ad5daee"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.448473 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" event={"ID":"99efba52-bc27-49d7-8efb-154b6e3787a9","Type":"ContainerStarted","Data":"d43b8bdc150cb1529e884ad3fa3abe213ef64ab2d2f93043bda6683af7158f68"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.449350 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn4vp\" (UniqueName: \"kubernetes.io/projected/3a0400a1-7e6b-4335-8819-586d7a460e3d-kube-api-access-vn4vp\") pod \"packageserver-d55dfcdfc-z8g4f\" (UID: \"3a0400a1-7e6b-4335-8819-586d7a460e3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.456461 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" event={"ID":"43cea3fb-14f9-4993-a8a9-4618680e8286","Type":"ContainerStarted","Data":"84a76e4c28a1a47f5b3b223ed33580d3157658233ba71b9ee938f5d574b00588"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.456505 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.458834 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lptjf"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.470426 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scsd7\" (UniqueName: \"kubernetes.io/projected/53a14d61-5c2c-44b8-b3cb-c8daa23762bf-kube-api-access-scsd7\") pod \"service-ca-9c57cc56f-h5xdn\" (UID: \"53a14d61-5c2c-44b8-b3cb-c8daa23762bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.476102 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" event={"ID":"d9572819-3894-4603-bd2b-7c9465bb0067","Type":"ContainerStarted","Data":"60211db626956dc1b7fbfef86dfdf4c016f3357c9749f94b72f116821c89798b"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.479629 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z" event={"ID":"c0da800f-a7ca-4d0e-89bb-96673854969e","Type":"ContainerStarted","Data":"8c23f5367b08d7ec924e81e14494da32936d3eded7ada5c1129339081e4668e4"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.488390 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" event={"ID":"6b6a9601-6689-435b-aca1-256a0c3c07fb","Type":"ContainerStarted","Data":"af40990270173839420767ce10fda62401f1d57ecdf0a8838d88225d26980226"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.496408 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" event={"ID":"2539fca8-3dde-43ed-815c-e837f37dfdd5","Type":"ContainerStarted","Data":"6cd2d6a279ab711c3730b39d682356d39f6069afd8b89fe32581e162caee0dfd"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.496455 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" event={"ID":"2539fca8-3dde-43ed-815c-e837f37dfdd5","Type":"ContainerStarted","Data":"7641cfeea867267cd87fc0fa0061dc622336101049c7d0d10a3af6628d0c4a6d"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.497498 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fnzxw" event={"ID":"a39a28f7-1fd2-44f7-8b49-05a0faf1e000","Type":"ContainerStarted","Data":"b8f6e716a5c1af59b334b099ae81e6b4488d512e7cedc4016a21f3ebde2403f2"}
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.511339 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7522\" (UniqueName: \"kubernetes.io/projected/ad648fa7-2560-4aa0-8634-05bcbc48916f-kube-api-access-n7522\") pod \"service-ca-operator-777779d784-t95b6\" (UID: \"ad648fa7-2560-4aa0-8634-05bcbc48916f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.516136 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grpxc\" (UniqueName: \"kubernetes.io/projected/ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806-kube-api-access-grpxc\") pod \"machine-config-operator-74547568cd-mz68f\" (UID: \"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.517791 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.520477 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.020457248 +0000 UTC m=+227.512191573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.541822 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.541951 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-4582s"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.551586 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.554585 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.558244 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9l2x\" (UniqueName: \"kubernetes.io/projected/87157be2-0fc3-4120-b9b6-d4494ace940a-kube-api-access-j9l2x\") pod \"multus-admission-controller-857f4d67dd-h8kqf\" (UID: \"87157be2-0fc3-4120-b9b6-d4494ace940a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.561047 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.561735 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3be6d75e-e4f8-4d9b-8ed3-9d25632de88c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mxb9q\" (UID: \"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.563675 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncjbj\" (UniqueName: \"kubernetes.io/projected/9b0edb65-3bcf-484f-9707-d8124df1ec88-kube-api-access-ncjbj\") pod \"package-server-manager-789f6589d5-kd6gw\" (UID: \"9b0edb65-3bcf-484f-9707-d8124df1ec88\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.577047 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-p4b5s"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.591707 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4zf8\" (UniqueName: \"kubernetes.io/projected/10f3d99e-72fa-4c62-8190-059d7a0effd1-kube-api-access-t4zf8\") pod \"kube-storage-version-migrator-operator-b67b599dd-dxd7p\" (UID: \"10f3d99e-72fa-4c62-8190-059d7a0effd1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.605290 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97ffe185-3f09-44d0-a173-f95bb53c419e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8kv4d\" (UID: \"97ffe185-3f09-44d0-a173-f95bb53c419e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.618958 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.621130 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.121114378 +0000 UTC m=+227.612848703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.624418 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lfsh\" (UniqueName: \"kubernetes.io/projected/d915f7d2-5b4d-4017-a839-b615a182fafb-kube-api-access-8lfsh\") pod \"collect-profiles-29563815-tsrs6\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.645547 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kwjh\" (UniqueName: \"kubernetes.io/projected/f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7-kube-api-access-6kwjh\") pod \"control-plane-machine-set-operator-78cbb6b69f-4lbr5\" (UID: \"f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.652104 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.660407 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.675330 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.681645 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzlbf\" (UniqueName: \"kubernetes.io/projected/aa4b5542-dc36-4c93-88e5-a080729b94ae-kube-api-access-dzlbf\") pod \"dns-default-hvmrz\" (UID: \"aa4b5542-dc36-4c93-88e5-a080729b94ae\") " pod="openshift-dns/dns-default-hvmrz"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.689999 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw7q6\" (UniqueName: \"kubernetes.io/projected/4edea753-21f5-44fd-b183-daf03845dcd8-kube-api-access-gw7q6\") pod \"catalog-operator-68c6474976-g686q\" (UID: \"4edea753-21f5-44fd-b183-daf03845dcd8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.696792 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.709346 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll8g5\" (UniqueName: \"kubernetes.io/projected/34ea1a9f-9093-421f-bef3-228352aa65fb-kube-api-access-ll8g5\") pod \"controller-manager-879f6c89f-7rr85\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.714839 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.721656 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.722168 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.22215283 +0000 UTC m=+227.713887165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.738960 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.753555 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.766084 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.777673 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.825298 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.825759 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.325740574 +0000 UTC m=+227.817474899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.840566 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xl5d7"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.845740 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.846014 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pvlch"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.856708 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 10:16:27 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld
Mar 18 10:16:27 crc kubenswrapper[4733]: [+]process-running ok
Mar 18 10:16:27 crc kubenswrapper[4733]: healthz check failed
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.856753 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.860640 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6hmz"]
Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.891573 4733 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-hvmrz" Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.902660 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m"] Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.909444 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb"] Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.929970 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:27 crc kubenswrapper[4733]: E0318 10:16:27.930561 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.430537936 +0000 UTC m=+227.922272261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:27 crc kubenswrapper[4733]: I0318 10:16:27.980767 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:27.999262 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.031620 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.032097 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.532081524 +0000 UTC m=+228.023815849 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.133764 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.134341 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.634317315 +0000 UTC m=+228.126051640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.235134 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.235416 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.735394928 +0000 UTC m=+228.227129253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.235752 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.235999 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.735988067 +0000 UTC m=+228.227722382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.288998 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lptjf"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.307302 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vsnq2"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.315347 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.339135 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.339390 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.839354963 +0000 UTC m=+228.331089288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.339634 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.339998 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.839990154 +0000 UTC m=+228.331724479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.441510 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.442305 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.942281476 +0000 UTC m=+228.434015801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.446580 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.447170 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:28.947158812 +0000 UTC m=+228.438893127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.534257 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" event={"ID":"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1","Type":"ContainerStarted","Data":"a4a546ed80545bf50a0d399d05bcd3718be5de86367b6c0e97b326427eeeb776"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.541621 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" event={"ID":"6b6a9601-6689-435b-aca1-256a0c3c07fb","Type":"ContainerStarted","Data":"72854d30dcf4277263030dcc04fa8e3b931cb135f23b8f64dd5d658527b9294a"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.541670 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" event={"ID":"6b6a9601-6689-435b-aca1-256a0c3c07fb","Type":"ContainerStarted","Data":"c1a12174bc8f1eb67af8ffffdb1fffbcbcdb362d4ea4dbcf508b0589cbe29f08"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.545361 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" event={"ID":"352d0ed5-c43b-431f-bd66-1749ab30d013","Type":"ContainerStarted","Data":"203bbcbf6e2d03e6169b7db13046fb530cfcba776d38b189faf47c799b7d4c84"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.547847 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.548020 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.047996687 +0000 UTC m=+228.539731012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.548343 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.548747 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.048730261 +0000 UTC m=+228.540464586 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.554027 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.555869 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gxcb2" event={"ID":"61e27ee7-5eb0-4cc7-a696-85ddd192b171","Type":"ContainerStarted","Data":"0e6c3d3da7414da2e0ca8efb58ca0763564c7f6014da1bcee33997eb225ede97"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.555993 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gxcb2" Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.562993 4733 patch_prober.go:28] interesting pod/downloads-7954f5f757-gxcb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.563066 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gxcb2" podUID="61e27ee7-5eb0-4cc7-a696-85ddd192b171" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.564351 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29563816-4582s"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.566152 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pvlch" event={"ID":"d5979b3d-b4b4-4081-b486-4fcf91f6367c","Type":"ContainerStarted","Data":"113cc6c1a897cb5a44dd78a65cb2ea6a6b8c07d1baccca5b4ff1d285831d52f9"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.569117 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8v244" event={"ID":"f27409fc-b6dd-4573-918b-7b30b3635cc7","Type":"ContainerStarted","Data":"bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.588221 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t95b6"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.588281 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lptjf" event={"ID":"4810c2fd-346b-44a0-b985-46d302060373","Type":"ContainerStarted","Data":"424301d2dbd8aeaa7b172e2a161e360ebcd9209429a5cadd81a09d0602e3686c"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.588305 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.594287 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9h9xr"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.612628 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z" event={"ID":"c0da800f-a7ca-4d0e-89bb-96673854969e","Type":"ContainerStarted","Data":"118ea78d245d5b728468627b8c7ab47e6f832335830c25ed70b25f6dd7a01a25"} Mar 18 10:16:28 crc 
kubenswrapper[4733]: I0318 10:16:28.640779 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fnzxw" event={"ID":"a39a28f7-1fd2-44f7-8b49-05a0faf1e000","Type":"ContainerStarted","Data":"30b798fe671c0535b3caa99e9a4d56b7a41c2f648180dfebad30e138508b8aa4"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.651607 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.652731 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.152709927 +0000 UTC m=+228.644444242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.675387 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" event={"ID":"9571ba80-f267-46ed-8d16-e44531cb0ce8","Type":"ContainerStarted","Data":"7509536c23a530965d60a2babd5ba5f501ce7c3e5ae88f80ac6edd7689874c81"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.676560 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb" event={"ID":"57151941-19ac-4bb5-a93b-b5dfbc88e0d6","Type":"ContainerStarted","Data":"a711eb4e22c7790a7f75329be3d0673d6babb3f6cd16985bba1d5a761f53ad59"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.690175 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" event={"ID":"99efba52-bc27-49d7-8efb-154b6e3787a9","Type":"ContainerStarted","Data":"6ca6d3f8258ac7f1ba833759822d563e438abce6593a7151642000287f32dcb6"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.693972 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" event={"ID":"d9572819-3894-4603-bd2b-7c9465bb0067","Type":"ContainerStarted","Data":"80055ea6064b163f8ba1a16c665bd9bac35a28e65e0935c42762d28779dcf6cd"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.708013 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" event={"ID":"2539fca8-3dde-43ed-815c-e837f37dfdd5","Type":"ContainerStarted","Data":"ecf92323ff5f1b4df76e0fcb1d32e8f9f8f8594734aaf26d3d3727e3e12ba2b6"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.718879 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" event={"ID":"158a5836-f175-4da3-b22d-6a3130a89d30","Type":"ContainerStarted","Data":"35bffb9ac015b9208a2749f27ac917300045ccb592684109b3caa94bf45c1e9b"} Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.753774 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.755766 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.255750593 +0000 UTC m=+228.747484918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.844921 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 10:16:28 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Mar 18 10:16:28 crc kubenswrapper[4733]: [+]process-running ok Mar 18 10:16:28 crc kubenswrapper[4733]: healthz check failed Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.845444 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.859592 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.860004 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 10:16:29.359989058 +0000 UTC m=+228.851723383 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.961323 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:28 crc kubenswrapper[4733]: E0318 10:16:28.961756 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.461743942 +0000 UTC m=+228.953478267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.967507 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.985759 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5"] Mar 18 10:16:28 crc kubenswrapper[4733]: I0318 10:16:28.985827 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xvnwv"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.062712 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.063353 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.563332832 +0000 UTC m=+229.055067157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.063511 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.063789 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.563775796 +0000 UTC m=+229.055510121 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.065862 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" podStartSLOduration=171.065838862 podStartE2EDuration="2m51.065838862s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.062256858 +0000 UTC m=+228.553991183" watchObservedRunningTime="2026-03-18 10:16:29.065838862 +0000 UTC m=+228.557573187" Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.164574 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.66454133 +0000 UTC m=+229.156275655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.164075 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.168453 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.177396 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.6773641 +0000 UTC m=+229.169098425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.181814 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-nbftd" podStartSLOduration=171.181794921 podStartE2EDuration="2m51.181794921s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.180307764 +0000 UTC m=+228.672042089" watchObservedRunningTime="2026-03-18 10:16:29.181794921 +0000 UTC m=+228.673529246" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.269354 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.270121 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.770105156 +0000 UTC m=+229.261839481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.372182 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.372520 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.872500232 +0000 UTC m=+229.364234557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.384223 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7z2vw" podStartSLOduration=171.384205606 podStartE2EDuration="2m51.384205606s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.383523114 +0000 UTC m=+228.875257449" watchObservedRunningTime="2026-03-18 10:16:29.384205606 +0000 UTC m=+228.875939931" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.447484 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p4b5s"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.451548 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.461038 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.466811 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" podStartSLOduration=171.466795118 podStartE2EDuration="2m51.466795118s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.466293052 +0000 UTC m=+228.958027377" watchObservedRunningTime="2026-03-18 10:16:29.466795118 +0000 UTC m=+228.958529443" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.476812 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.477248 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:29.977232482 +0000 UTC m=+229.468966807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.491096 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7rr85"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.492629 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.558244 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.571628 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xl5d7" podStartSLOduration=171.571602391 podStartE2EDuration="2m51.571602391s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.555216756 +0000 UTC m=+229.046951081" watchObservedRunningTime="2026-03-18 10:16:29.571602391 +0000 UTC m=+229.063336716" Mar 18 10:16:29 crc kubenswrapper[4733]: W0318 10:16:29.574821 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34ea1a9f_9093_421f_bef3_228352aa65fb.slice/crio-9ae227eb47f15060adbbd96eb5744108b09bf4ae0d948bfba7b04b2c867d1d95 WatchSource:0}: Error finding container 
9ae227eb47f15060adbbd96eb5744108b09bf4ae0d948bfba7b04b2c867d1d95: Status 404 returned error can't find the container with id 9ae227eb47f15060adbbd96eb5744108b09bf4ae0d948bfba7b04b2c867d1d95 Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.579085 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.579496 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.079484173 +0000 UTC m=+229.571218498 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.590663 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.600707 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xh9n5" podStartSLOduration=171.600685941 podStartE2EDuration="2m51.600685941s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.599619767 +0000 UTC m=+229.091354092" watchObservedRunningTime="2026-03-18 10:16:29.600685941 +0000 UTC m=+229.092420266" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.626609 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h8kqf"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.642725 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40554: no serving certificate available for the kubelet" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.649720 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h5xdn"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.659684 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q"] Mar 18 
10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.662855 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-fnzxw" podStartSLOduration=5.6628391879999995 podStartE2EDuration="5.662839188s" podCreationTimestamp="2026-03-18 10:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.627025724 +0000 UTC m=+229.118760059" watchObservedRunningTime="2026-03-18 10:16:29.662839188 +0000 UTC m=+229.154573513" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.670971 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.672041 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8v244" podStartSLOduration=171.672028932 podStartE2EDuration="2m51.672028932s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.66538943 +0000 UTC m=+229.157123755" watchObservedRunningTime="2026-03-18 10:16:29.672028932 +0000 UTC m=+229.163763257" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.680559 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.681050 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.18103307 +0000 UTC m=+229.672767395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.712227 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.715702 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gxcb2" podStartSLOduration=171.715663268 podStartE2EDuration="2m51.715663268s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.709233692 +0000 UTC m=+229.200968027" watchObservedRunningTime="2026-03-18 10:16:29.715663268 +0000 UTC m=+229.207397603" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.723483 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hvmrz"] Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.738076 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40556: no serving certificate available for the kubelet" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.746282 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9dd56" 
podStartSLOduration=171.746254796 podStartE2EDuration="2m51.746254796s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.746247316 +0000 UTC m=+229.237981661" watchObservedRunningTime="2026-03-18 10:16:29.746254796 +0000 UTC m=+229.237989121" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.758213 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" event={"ID":"3a0400a1-7e6b-4335-8819-586d7a460e3d","Type":"ContainerStarted","Data":"c4a9ee364cd2466cb55a3d51888b8c4101c267164bf3b5436f6235972044a23d"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.766080 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d" event={"ID":"97ffe185-3f09-44d0-a173-f95bb53c419e","Type":"ContainerStarted","Data":"c46d0d6a694f3a8b4e2de7f5805dda623a4efdfb1d033cf681612d347d2aa0a3"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.788129 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.789607 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.289587673 +0000 UTC m=+229.781321998 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.791435 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" event={"ID":"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf","Type":"ContainerStarted","Data":"7968cddcf548bf1f43ac4db94f04ec9f35ec3eef36c903d4c32d8745369c8245"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.798723 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" event={"ID":"9b0edb65-3bcf-484f-9707-d8124df1ec88","Type":"ContainerStarted","Data":"ffbc1663983c6339c5f1c43fda83f1cfb6b583c431dab23cd21586b747a1455c"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.802524 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" podStartSLOduration=171.802503586 podStartE2EDuration="2m51.802503586s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.794249452 +0000 UTC m=+229.285983777" watchObservedRunningTime="2026-03-18 10:16:29.802503586 +0000 UTC m=+229.294237901" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.828998 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" 
event={"ID":"b405f127-b181-49a1-8205-aafd58d1fa7b","Type":"ContainerStarted","Data":"5fe24ce835535b837457b8d82041843f9c571cab16780054b4369e26fe3d7709"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.829049 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" event={"ID":"b405f127-b181-49a1-8205-aafd58d1fa7b","Type":"ContainerStarted","Data":"235ba7c7a945d007329d95e7f0539da81a2264608d4ac0ddc4e6122ef0b4674b"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.832415 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" event={"ID":"ed943a82-ef39-4ebc-9d76-09bb69f3b800","Type":"ContainerStarted","Data":"7c30dd242387191c6740e0ff74bbf53b6106d712df90701a2547298bdbf37b7f"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.833324 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-zztn5" podStartSLOduration=171.833305901 podStartE2EDuration="2m51.833305901s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.830825312 +0000 UTC m=+229.322559637" watchObservedRunningTime="2026-03-18 10:16:29.833305901 +0000 UTC m=+229.325040226" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.835641 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40566: no serving certificate available for the kubelet" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.838681 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" event={"ID":"9571ba80-f267-46ed-8d16-e44531cb0ce8","Type":"ContainerStarted","Data":"184a1347e2d9e7d55a6bb8afbead124af12ccaa51acec4577b15265b8ce508c8"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.842107 
4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.842131 4733 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2wc5m container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.842180 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" podUID="9571ba80-f267-46ed-8d16-e44531cb0ce8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.844626 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" event={"ID":"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6","Type":"ContainerStarted","Data":"82712470f77bccb95e10909493ae2791b9254e9d32324a113d332c37e6e2d3fa"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.848973 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 10:16:29 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Mar 18 10:16:29 crc kubenswrapper[4733]: [+]process-running ok Mar 18 10:16:29 crc kubenswrapper[4733]: healthz check failed Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.849043 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.857364 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-4582s" event={"ID":"71a70c3c-d483-43f4-9f54-10978c7f8cc8","Type":"ContainerStarted","Data":"f9019fd1aca4002d61050c62413d5f0b6ff4613e81da7416fe1c8a2924a20e45"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.861392 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" event={"ID":"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c","Type":"ContainerStarted","Data":"1f126ed2ad081a81751bb1d13eee61d744b9ca8e02bc19959b6f61e8304ddd49"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.866726 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" event={"ID":"10e64d74-2e25-41fd-a9ad-32a3e74e5c01","Type":"ContainerStarted","Data":"7ad83af7f3d2affc856d9f1d6fdf699c0d30ad628e2aaa849d3a379d7eac769b"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.866844 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" event={"ID":"10e64d74-2e25-41fd-a9ad-32a3e74e5c01","Type":"ContainerStarted","Data":"19c874d28f8aad6daae528555cc1f657cd4fe256c49afabb8796c57d354445a5"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.889649 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pvlch" event={"ID":"d5979b3d-b4b4-4081-b486-4fcf91f6367c","Type":"ContainerStarted","Data":"819d4caa673ff793c5cea8327f5cf458e46f808282d254d0e7ec7b02cebaa2d8"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.890101 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ltwbb" podStartSLOduration=171.890075127 podStartE2EDuration="2m51.890075127s" 
podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.864881911 +0000 UTC m=+229.356616236" watchObservedRunningTime="2026-03-18 10:16:29.890075127 +0000 UTC m=+229.381809452" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.891061 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.891766 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.391749221 +0000 UTC m=+229.883483536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.899642 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5" event={"ID":"f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7","Type":"ContainerStarted","Data":"5a8e674f13f4d362f5825ede2deb20ef47683fe21530ead2b54068d18e923d72"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.902425 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" event={"ID":"5192f67b-f2ab-45eb-9b1a-64bdff02437a","Type":"ContainerStarted","Data":"3d92f9fbfa1c8b8490e331060d587d908cf420777497bf90bb4815f3f49e79dd"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.902468 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" event={"ID":"5192f67b-f2ab-45eb-9b1a-64bdff02437a","Type":"ContainerStarted","Data":"7e583c6a058ccd4e267ac556fbc1ecc397a1e062881c05b38f716c2d4a35947b"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.902891 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.910725 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9h9xr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection 
refused" start-of-body= Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.910799 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.922731 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" event={"ID":"84ddb369-1909-4d63-a0c0-b250490992c0","Type":"ContainerStarted","Data":"47c628a39b3f5cebbc6098ab5941f0a3a5f4fd2b1f27c116d8289f2beae5970f"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.943624 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40578: no serving certificate available for the kubelet" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.944920 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" podStartSLOduration=171.94488931 podStartE2EDuration="2m51.94488931s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:29.943962361 +0000 UTC m=+229.435696686" watchObservedRunningTime="2026-03-18 10:16:29.94488931 +0000 UTC m=+229.436623635" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.946300 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gdmqx" podStartSLOduration=171.946292025 podStartE2EDuration="2m51.946292025s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-18 10:16:29.907292568 +0000 UTC m=+229.399026913" watchObservedRunningTime="2026-03-18 10:16:29.946292025 +0000 UTC m=+229.438026350" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.968628 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lptjf" event={"ID":"4810c2fd-346b-44a0-b985-46d302060373","Type":"ContainerStarted","Data":"735d2c405cd183d67cea5e3779b6db05a6c1b4858cd0631b0d3af1260094e44e"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.969923 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-lptjf" Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.973608 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" event={"ID":"53a14d61-5c2c-44b8-b3cb-c8daa23762bf","Type":"ContainerStarted","Data":"a7322d3ee85a057af8ee5370aa5b68beac85fd2e9f9f76ed7fe65f16fe471252"} Mar 18 10:16:29 crc kubenswrapper[4733]: I0318 10:16:29.995948 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:29 crc kubenswrapper[4733]: E0318 10:16:29.997742 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.497729071 +0000 UTC m=+229.989463396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.002492 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" event={"ID":"34ea1a9f-9093-421f-bef3-228352aa65fb","Type":"ContainerStarted","Data":"9ae227eb47f15060adbbd96eb5744108b09bf4ae0d948bfba7b04b2c867d1d95"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.002541 4733 patch_prober.go:28] interesting pod/console-operator-58897d9998-lptjf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.002583 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lptjf" podUID="4810c2fd-346b-44a0-b985-46d302060373" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.006493 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" event={"ID":"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806","Type":"ContainerStarted","Data":"11401c60ab3a6d8f146de60797f2a2f1f201df5e3acb2ea3c5026e188ee89b7c"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.012145 4733 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" event={"ID":"ad648fa7-2560-4aa0-8634-05bcbc48916f","Type":"ContainerStarted","Data":"4ae7727fa124f9a4a77bef56557f443b2ac6b4ef5dd776d94f91112478e6c6e7"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.022319 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5" podStartSLOduration=172.022300197 podStartE2EDuration="2m52.022300197s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.022277816 +0000 UTC m=+229.514012141" watchObservedRunningTime="2026-03-18 10:16:30.022300197 +0000 UTC m=+229.514034522" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.023248 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pvlch" podStartSLOduration=6.023242097 podStartE2EDuration="6.023242097s" podCreationTimestamp="2026-03-18 10:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.002296777 +0000 UTC m=+229.494031102" watchObservedRunningTime="2026-03-18 10:16:30.023242097 +0000 UTC m=+229.514976412" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.037528 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40586: no serving certificate available for the kubelet" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.045714 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf" event={"ID":"87157be2-0fc3-4120-b9b6-d4494ace940a","Type":"ContainerStarted","Data":"2b43a0d12ed658afd5390df634ceef4336e71a96359cd4aabe1b8017a2b0df25"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 
10:16:30.054087 4733 generic.go:334] "Generic (PLEG): container finished" podID="352d0ed5-c43b-431f-bd66-1749ab30d013" containerID="c66b0d6638a9b20b71ffbf84d0f23d56294dbac26257e2ca3a04af6eb34a1ccf" exitCode=0 Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.054150 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" event={"ID":"352d0ed5-c43b-431f-bd66-1749ab30d013","Type":"ContainerDied","Data":"c66b0d6638a9b20b71ffbf84d0f23d56294dbac26257e2ca3a04af6eb34a1ccf"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.070767 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" podStartSLOduration=172.070744516 podStartE2EDuration="2m52.070744516s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.065677294 +0000 UTC m=+229.557411619" watchObservedRunningTime="2026-03-18 10:16:30.070744516 +0000 UTC m=+229.562478841" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.071120 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb" event={"ID":"57151941-19ac-4bb5-a93b-b5dfbc88e0d6","Type":"ContainerStarted","Data":"38b3b1f22a2a3afbcf93b73f49a0b9a28fa07f5a614dc83f62b7fd5c9b7ffac1"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.078440 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z" event={"ID":"c0da800f-a7ca-4d0e-89bb-96673854969e","Type":"ContainerStarted","Data":"8864b7b9e0f87345c4e5bb9b84bac1c3dd7621197cee5f9ae21fd5fc4bee7055"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.086488 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qs72s" event={"ID":"99efba52-bc27-49d7-8efb-154b6e3787a9","Type":"ContainerStarted","Data":"f36c5a646fe472dc33637a91fcbb88bf31006bcc9aac430a12510e48916562d6"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.092476 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" event={"ID":"d915f7d2-5b4d-4017-a839-b615a182fafb","Type":"ContainerStarted","Data":"4abe99dbd7bc3b694bde422289cb5e8d4d69c342990c6d29b9ffcb65e8f885f7"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.098583 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.100429 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.600411745 +0000 UTC m=+230.092146070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.116100 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" podStartSLOduration=172.116080677 podStartE2EDuration="2m52.116080677s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.115679634 +0000 UTC m=+229.607413969" watchObservedRunningTime="2026-03-18 10:16:30.116080677 +0000 UTC m=+229.607815002" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.139519 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" event={"ID":"4edea753-21f5-44fd-b183-daf03845dcd8","Type":"ContainerStarted","Data":"b051ef59b46aa4f4ca585c5faa0bd51cc805330540dddec38e277e217f6b2385"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.147871 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-lptjf" podStartSLOduration=172.147849913 podStartE2EDuration="2m52.147849913s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.147076668 +0000 UTC m=+229.638810993" watchObservedRunningTime="2026-03-18 10:16:30.147849913 +0000 UTC m=+229.639584238" Mar 18 10:16:30 
crc kubenswrapper[4733]: I0318 10:16:30.155008 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40590: no serving certificate available for the kubelet" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.167820 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" event={"ID":"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1","Type":"ContainerStarted","Data":"2063ba38b8f338dff7686f6578cd42c9d0c532672eb45f293854b46ba18f0fea"} Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.168684 4733 patch_prober.go:28] interesting pod/downloads-7954f5f757-gxcb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.168709 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gxcb2" podUID="61e27ee7-5eb0-4cc7-a696-85ddd192b171" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.169277 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.201787 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.237045 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.737017945 +0000 UTC m=+230.228752270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.275437 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40592: no serving certificate available for the kubelet" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.296062 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb" podStartSLOduration=172.296044933 podStartE2EDuration="2m52.296044933s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.251444077 +0000 UTC m=+229.743178392" watchObservedRunningTime="2026-03-18 10:16:30.296044933 +0000 UTC m=+229.787779258" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.303396 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6572z" podStartSLOduration=172.303387358 podStartE2EDuration="2m52.303387358s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.297654905 +0000 UTC 
m=+229.789389230" watchObservedRunningTime="2026-03-18 10:16:30.303387358 +0000 UTC m=+229.795121683" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.312393 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.312860 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.812841541 +0000 UTC m=+230.304575866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.313111 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.314495 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.814487683 +0000 UTC m=+230.306222008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.330500 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" podStartSLOduration=172.330479025 podStartE2EDuration="2m52.330479025s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.329568986 +0000 UTC m=+229.821303311" watchObservedRunningTime="2026-03-18 10:16:30.330479025 +0000 UTC m=+229.822213350" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.370237 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" podStartSLOduration=172.370211876 podStartE2EDuration="2m52.370211876s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:30.366801027 +0000 UTC m=+229.858535362" watchObservedRunningTime="2026-03-18 10:16:30.370211876 +0000 UTC m=+229.861946201" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.382979 4733 ???:1] "http: TLS handshake error 
from 192.168.126.11:40606: no serving certificate available for the kubelet" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.414098 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.414319 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.914276455 +0000 UTC m=+230.406010780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.414494 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.414850 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-18 10:16:30.914832163 +0000 UTC m=+230.406566488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.515489 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.516030 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.01600975 +0000 UTC m=+230.507744075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.617136 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.617932 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.117907649 +0000 UTC m=+230.609641974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.719109 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.719464 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.219433637 +0000 UTC m=+230.711167962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.719730 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.720281 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.220273854 +0000 UTC m=+230.712008179 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.821272 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.825422 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.325393196 +0000 UTC m=+230.817127521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.851086 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 10:16:30 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Mar 18 10:16:30 crc kubenswrapper[4733]: [+]process-running ok Mar 18 10:16:30 crc kubenswrapper[4733]: healthz check failed Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.851162 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 10:16:30 crc kubenswrapper[4733]: I0318 10:16:30.926653 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:30 crc kubenswrapper[4733]: E0318 10:16:30.927211 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 10:16:31.427161872 +0000 UTC m=+230.918896197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.028608 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.029203 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.529161754 +0000 UTC m=+231.020896079 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.067215 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.092153 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40622: no serving certificate available for the kubelet" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.135130 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.135553 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.635536987 +0000 UTC m=+231.127271312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.219615 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d" event={"ID":"97ffe185-3f09-44d0-a173-f95bb53c419e","Type":"ContainerStarted","Data":"bb2753716ca08e78d86f779de672cd1596dc670da8dd0801512daa94492f1359"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.226304 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bzhq6" event={"ID":"84ddb369-1909-4d63-a0c0-b250490992c0","Type":"ContainerStarted","Data":"75e60fb3f0f458ee4c62fb018066b15b13329856a88c2a0a0d038e4092ccd720"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.234215 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p" event={"ID":"10f3d99e-72fa-4c62-8190-059d7a0effd1","Type":"ContainerStarted","Data":"5302f1d354e8e9a2de91a66e848b2bc30e7eb85e32973869e327011ccf471b52"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.234288 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p" event={"ID":"10f3d99e-72fa-4c62-8190-059d7a0effd1","Type":"ContainerStarted","Data":"90cf3f6d25ee865494955cc69830e65df1fe45b9b7f73813f62b695dffa405ec"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.235976 
4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.236653 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.736626661 +0000 UTC m=+231.228360986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.239325 4733 generic.go:334] "Generic (PLEG): container finished" podID="56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6" containerID="6a4e3a2300d0d51d2c8f47aef73b486ee2ec5f3083882676ec91b5dcd4699c4b" exitCode=0 Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.239460 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" event={"ID":"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6","Type":"ContainerDied","Data":"6a4e3a2300d0d51d2c8f47aef73b486ee2ec5f3083882676ec91b5dcd4699c4b"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.272832 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvmrz" 
event={"ID":"aa4b5542-dc36-4c93-88e5-a080729b94ae","Type":"ContainerStarted","Data":"6aeac68338b17556e78e0693c265521b8112d378d658532a0beb788af0e2a498"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.273453 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvmrz" event={"ID":"aa4b5542-dc36-4c93-88e5-a080729b94ae","Type":"ContainerStarted","Data":"f28fd3ee37f3b46e37e0643e10dfc0ade0cab785919ea69a9d55380f3f92801a"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.288568 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" event={"ID":"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806","Type":"ContainerStarted","Data":"17fd416c7f22590f986e89b3b7da77f9f43747162343171b8e0d88fabbe2b739"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.288639 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" event={"ID":"ec8840c3-e0bd-4cf0-9dd4-87d9ae93b806","Type":"ContainerStarted","Data":"299bbe114466f39c88fc0e927e7d41abfe8761da05c19e5db337ebba724013bc"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.290203 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-848w7" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.294334 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" event={"ID":"3a0400a1-7e6b-4335-8819-586d7a460e3d","Type":"ContainerStarted","Data":"9f721d740124d28412f1818fcb188fd9065da5b33f9c191a8e29248b89331523"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.295449 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.300427 4733 
patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z8g4f container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.300489 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" podUID="3a0400a1-7e6b-4335-8819-586d7a460e3d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.323770 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t95b6" event={"ID":"ad648fa7-2560-4aa0-8634-05bcbc48916f","Type":"ContainerStarted","Data":"f271cbb98f96b3dc9e2b032327eccbc87d87a011c3c9b9e24fef1ee6ebfc6652"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.335987 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" event={"ID":"34ea1a9f-9093-421f-bef3-228352aa65fb","Type":"ContainerStarted","Data":"03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.336900 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.338055 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.339991 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.839966316 +0000 UTC m=+231.331700641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.343116 4733 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7rr85 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.343372 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" podUID="34ea1a9f-9093-421f-bef3-228352aa65fb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.356411 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4lbr5" event={"ID":"f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7","Type":"ContainerStarted","Data":"b1105779c36b18b11083ade64d7fa22cfa35c4260c2e8d7fff9f833daff1fb14"} Mar 18 
10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.368378 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" event={"ID":"fe90c8a8-c79a-4ed5-bec1-5ea07fbad5cf","Type":"ContainerStarted","Data":"f9d06b9c4e750bd26bf32a4044dfa96fbe08e2c0e576b45953de970b2906834a"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.395882 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" event={"ID":"3be6d75e-e4f8-4d9b-8ed3-9d25632de88c","Type":"ContainerStarted","Data":"83e067e82308fb0fa37c1329e5952542a9cf5a80f17058f1f5f162bce23dc702"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.424680 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hw7zb" event={"ID":"57151941-19ac-4bb5-a93b-b5dfbc88e0d6","Type":"ContainerStarted","Data":"4be9612a0415c389c710939bcff2afe74ad74bc4f0a20b0a6cbc8607ac4f5a0c"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.442896 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.443066 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.943040604 +0000 UTC m=+231.434774929 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.443280 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.444761 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:31.944753428 +0000 UTC m=+231.436487753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.461147 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" event={"ID":"10e64d74-2e25-41fd-a9ad-32a3e74e5c01","Type":"ContainerStarted","Data":"15a437c1a993be4ad55457ce7627e92e26a819579687176c93eacbf55e8fea4b"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.476375 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" event={"ID":"d915f7d2-5b4d-4017-a839-b615a182fafb","Type":"ContainerStarted","Data":"6068780e861c95e2a5524c6995b5943bf2eb924f4e716f49bfa978772d8dc58d"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.513162 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" event={"ID":"352d0ed5-c43b-431f-bd66-1749ab30d013","Type":"ContainerStarted","Data":"6a709b328a9b33a56f3bd83964c1b7d7d9b38767e78a50508642cd2cb1b3e3e3"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.531480 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" event={"ID":"4edea753-21f5-44fd-b183-daf03845dcd8","Type":"ContainerStarted","Data":"43e71689f4fab8e9f651663bc2478910fb3f8ac94b0e8ba1b81778232bce284c"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.532918 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.542532 4733 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g686q container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.542592 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" podUID="4edea753-21f5-44fd-b183-daf03845dcd8" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.544944 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.546265 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.046245315 +0000 UTC m=+231.537979640 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.548817 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf" event={"ID":"87157be2-0fc3-4120-b9b6-d4494ace940a","Type":"ContainerStarted","Data":"c4b3ec47f6207612c1d201c727747846b79e1d7aa74dde8f58a8c4d4d9bdafd6"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.548857 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf" event={"ID":"87157be2-0fc3-4120-b9b6-d4494ace940a","Type":"ContainerStarted","Data":"a5688395d385853aec18579f7e1a245d67a8328ab4238c4893b9955e24c11802"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.568998 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" event={"ID":"9b0edb65-3bcf-484f-9707-d8124df1ec88","Type":"ContainerStarted","Data":"d3395f5c56bba0e16edb19611a603f0877a770333d6a8dd11f8b6322718eafec"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.569049 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" event={"ID":"9b0edb65-3bcf-484f-9707-d8124df1ec88","Type":"ContainerStarted","Data":"661d9316b3add46a7098ad8b08dc2f8d0026c9f764dbf292a94aa781a7beff3d"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.573745 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.586403 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" event={"ID":"53a14d61-5c2c-44b8-b3cb-c8daa23762bf","Type":"ContainerStarted","Data":"808f7dc813d586a3560911f32dd518a52d54492e86216f58c1d8de30be696b25"} Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.600645 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9h9xr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.600709 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.601143 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.647419 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.652857 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.152844625 +0000 UTC m=+231.644578950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.749122 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.750302 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.250276122 +0000 UTC m=+231.742010447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.762388 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" podStartSLOduration=173.762356988 podStartE2EDuration="2m53.762356988s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:31.723710702 +0000 UTC m=+231.215445027" watchObservedRunningTime="2026-03-18 10:16:31.762356988 +0000 UTC m=+231.254091303" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.764660 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mz68f" podStartSLOduration=173.764651651 podStartE2EDuration="2m53.764651651s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:31.763581057 +0000 UTC m=+231.255315382" watchObservedRunningTime="2026-03-18 10:16:31.764651651 +0000 UTC m=+231.256385976" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.793822 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-h5xdn" podStartSLOduration=173.793806074 podStartE2EDuration="2m53.793806074s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:31.791466289 +0000 UTC m=+231.283200614" watchObservedRunningTime="2026-03-18 10:16:31.793806074 +0000 UTC m=+231.285540399" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.840722 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vsnq2" podStartSLOduration=173.840699154 podStartE2EDuration="2m53.840699154s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:31.815488588 +0000 UTC m=+231.307222913" watchObservedRunningTime="2026-03-18 10:16:31.840699154 +0000 UTC m=+231.332433479" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.851119 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.851627 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.351610983 +0000 UTC m=+231.843345308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.861599 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 10:16:31 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Mar 18 10:16:31 crc kubenswrapper[4733]: [+]process-running ok Mar 18 10:16:31 crc kubenswrapper[4733]: healthz check failed Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.861730 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.888715 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-h8kqf" podStartSLOduration=173.888689059 podStartE2EDuration="2m53.888689059s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:31.883636077 +0000 UTC m=+231.375370402" watchObservedRunningTime="2026-03-18 10:16:31.888689059 +0000 UTC m=+231.380423384" Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.894331 4733 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" podStartSLOduration=173.894304859 podStartE2EDuration="2m53.894304859s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:31.844398542 +0000 UTC m=+231.336132867" watchObservedRunningTime="2026-03-18 10:16:31.894304859 +0000 UTC m=+231.386039184"
Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.953814 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:31 crc kubenswrapper[4733]: E0318 10:16:31.954305 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.454289647 +0000 UTC m=+231.946023972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.955797 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8kv4d" podStartSLOduration=173.955780725 podStartE2EDuration="2m53.955780725s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:31.908588486 +0000 UTC m=+231.400322821" watchObservedRunningTime="2026-03-18 10:16:31.955780725 +0000 UTC m=+231.447515040"
Mar 18 10:16:31 crc kubenswrapper[4733]: I0318 10:16:31.991170 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mxb9q" podStartSLOduration=173.991142256 podStartE2EDuration="2m53.991142256s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:31.989612307 +0000 UTC m=+231.481346642" watchObservedRunningTime="2026-03-18 10:16:31.991142256 +0000 UTC m=+231.482876581"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.021381 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mj46t" podStartSLOduration=174.021355973 podStartE2EDuration="2m54.021355973s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:32.018738719 +0000 UTC m=+231.510473044" watchObservedRunningTime="2026-03-18 10:16:32.021355973 +0000 UTC m=+231.513090298"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.056466 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.057436 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.557420146 +0000 UTC m=+232.049154461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.114722 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q" podStartSLOduration=174.114707299 podStartE2EDuration="2m54.114707299s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:32.108550532 +0000 UTC m=+231.600284857" watchObservedRunningTime="2026-03-18 10:16:32.114707299 +0000 UTC m=+231.606441624"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.161329 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.162000 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.661982731 +0000 UTC m=+232.153717046 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.236497 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.236986 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.266685 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.267262 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.767238828 +0000 UTC m=+232.258973153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.270797 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" podStartSLOduration=92.270771631 podStartE2EDuration="1m32.270771631s" podCreationTimestamp="2026-03-18 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:32.26793437 +0000 UTC m=+231.759668715" watchObservedRunningTime="2026-03-18 10:16:32.270771631 +0000 UTC m=+231.762505956"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.357247 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dxd7p" podStartSLOduration=174.357221587 podStartE2EDuration="2m54.357221587s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:32.317072422 +0000 UTC m=+231.808806747" watchObservedRunningTime="2026-03-18 10:16:32.357221587 +0000 UTC m=+231.848955902"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.370945 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.371305 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.871287196 +0000 UTC m=+232.363021521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.389660 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f" podStartSLOduration=174.389633853 podStartE2EDuration="2m54.389633853s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:32.389067495 +0000 UTC m=+231.880801820" watchObservedRunningTime="2026-03-18 10:16:32.389633853 +0000 UTC m=+231.881368178"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.430539 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" podStartSLOduration=174.430524061 podStartE2EDuration="2m54.430524061s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:32.428877829 +0000 UTC m=+231.920612154" watchObservedRunningTime="2026-03-18 10:16:32.430524061 +0000 UTC m=+231.922258386"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.473916 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.474308 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:32.974295731 +0000 UTC m=+232.466030056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.499505 4733 ???:1] "http: TLS handshake error from 192.168.126.11:40628: no serving certificate available for the kubelet"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.576031 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.577006 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.076982556 +0000 UTC m=+232.568716881 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.589770 4733 patch_prober.go:28] interesting pod/console-operator-58897d9998-lptjf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.590041 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lptjf" podUID="4810c2fd-346b-44a0-b985-46d302060373" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.631921 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" event={"ID":"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6","Type":"ContainerStarted","Data":"749934ceee0a98d547c6ab88a5257bf4a284fe859a9a2040c67a6910c663a176"}
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.632431 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" event={"ID":"56f4298d-e4ce-42a3-a0dc-d9c94b84dfe6","Type":"ContainerStarted","Data":"293173a5534f294877b5577b8f8b99df7629cfb86013c76769749a28770ec93d"}
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.654895 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hvmrz" event={"ID":"aa4b5542-dc36-4c93-88e5-a080729b94ae","Type":"ContainerStarted","Data":"442de960ed251c6b8645a6adcae9131d7e11d33e7e1b5f646f14c0f5f4d67fde"}
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.655966 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hvmrz"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.681100 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.684361 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.184344371 +0000 UTC m=+232.676078696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.685420 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" event={"ID":"ed943a82-ef39-4ebc-9d76-09bb69f3b800","Type":"ContainerStarted","Data":"5569db947d2a68531f9a1c2042230297737a979a298e0604d72073782235d034"}
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.686554 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9h9xr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.686632 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.693047 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.694584 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hvmrz" podStartSLOduration=8.694574308 podStartE2EDuration="8.694574308s" podCreationTimestamp="2026-03-18 10:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:32.691760118 +0000 UTC m=+232.183494443" watchObservedRunningTime="2026-03-18 10:16:32.694574308 +0000 UTC m=+232.186308633"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.696933 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" podStartSLOduration=174.696923063 podStartE2EDuration="2m54.696923063s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:32.670605711 +0000 UTC m=+232.162340036" watchObservedRunningTime="2026-03-18 10:16:32.696923063 +0000 UTC m=+232.188657388"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.709091 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-lptjf"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.709405 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g686q"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.764127 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z8g4f"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.783875 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.786563 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.28653698 +0000 UTC m=+232.778271305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.850825 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 10:16:32 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld
Mar 18 10:16:32 crc kubenswrapper[4733]: [+]process-running ok
Mar 18 10:16:32 crc kubenswrapper[4733]: healthz check failed
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.850908 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.886372 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.887102 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.387083696 +0000 UTC m=+232.878818021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.989991 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.990513 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.490470693 +0000 UTC m=+232.982205018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:32 crc kubenswrapper[4733]: I0318 10:16:32.990934 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:32 crc kubenswrapper[4733]: E0318 10:16:32.991577 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.491567228 +0000 UTC m=+232.983301553 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.054763 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4"
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.092251 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.092769 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.592742685 +0000 UTC m=+233.084477010 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.092947 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.093599 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.593580211 +0000 UTC m=+233.085314726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.194025 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.194224 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.69419688 +0000 UTC m=+233.185931205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.194671 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.194986 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.694974825 +0000 UTC m=+233.186709150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.296308 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.296652 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.796586894 +0000 UTC m=+233.288321219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.297169 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.297694 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.797685959 +0000 UTC m=+233.289420284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.398136 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.398331 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.898303178 +0000 UTC m=+233.390037503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.398464 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.398866 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:33.898850785 +0000 UTC m=+233.390585110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.499962 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.500884 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.000841198 +0000 UTC m=+233.492575523 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.500963 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.501348 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.001332883 +0000 UTC m=+233.493067208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.544563 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7rr85"] Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.565096 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95"] Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.565764 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" podUID="25331c44-b639-46f7-8a7f-6f62f8779e2b" containerName="route-controller-manager" containerID="cri-o://3c09df7a275938153d455f147ffe12eff185edea72f1d8646898b9ba5ba684d5" gracePeriod=30 Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.602665 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.603140 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 10:16:34.103118029 +0000 UTC m=+233.594852354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.704243 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" event={"ID":"ed943a82-ef39-4ebc-9d76-09bb69f3b800","Type":"ContainerStarted","Data":"8cb1ca64aafbb94fb628a8e324049ab71e0109fbe0ae58bb9968f02da229b684"} Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.704899 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.705365 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.20535277 +0000 UTC m=+233.697087095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.718220 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsqn4" Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.806276 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.807456 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.307410934 +0000 UTC m=+233.799145259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.855843 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 10:16:33 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Mar 18 10:16:33 crc kubenswrapper[4733]: [+]process-running ok Mar 18 10:16:33 crc kubenswrapper[4733]: healthz check failed Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.855916 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 10:16:33 crc kubenswrapper[4733]: I0318 10:16:33.908475 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:33 crc kubenswrapper[4733]: E0318 10:16:33.908900 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 10:16:34.40888337 +0000 UTC m=+233.900617695 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.009727 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.010237 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.510207972 +0000 UTC m=+234.001942297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.096336 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rls2r"] Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.097254 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.109143 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.111983 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.112455 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.612422151 +0000 UTC m=+234.104156476 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.164762 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rls2r"] Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.164816 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f92nl"] Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.165663 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.172037 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.194172 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f92nl"] Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.213973 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.214355 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4k9\" (UniqueName: 
\"kubernetes.io/projected/527056ad-4daf-4dd5-9e31-887d55be0336-kube-api-access-9j4k9\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.214435 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.714401423 +0000 UTC m=+234.206135748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.214533 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-catalog-content\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.214583 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-utilities\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.214647 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-catalog-content\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.214812 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-utilities\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.214832 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7hv7\" (UniqueName: \"kubernetes.io/projected/92996997-080b-42c9-bc2c-19c2e68db896-kube-api-access-w7hv7\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.320374 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-catalog-content\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.320426 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-utilities\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.320456 
4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-catalog-content\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.320510 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-utilities\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.320528 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7hv7\" (UniqueName: \"kubernetes.io/projected/92996997-080b-42c9-bc2c-19c2e68db896-kube-api-access-w7hv7\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.320548 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.320574 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4k9\" (UniqueName: \"kubernetes.io/projected/527056ad-4daf-4dd5-9e31-887d55be0336-kube-api-access-9j4k9\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 
10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.321690 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-catalog-content\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.322053 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.822042607 +0000 UTC m=+234.313776932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.322129 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-utilities\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.322419 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-catalog-content\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc 
kubenswrapper[4733]: I0318 10:16:34.322506 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-utilities\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.328388 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w7rrs"] Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.331085 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.342885 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w7rrs"] Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.359800 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7hv7\" (UniqueName: \"kubernetes.io/projected/92996997-080b-42c9-bc2c-19c2e68db896-kube-api-access-w7hv7\") pod \"certified-operators-rls2r\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.362506 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4k9\" (UniqueName: \"kubernetes.io/projected/527056ad-4daf-4dd5-9e31-887d55be0336-kube-api-access-9j4k9\") pod \"community-operators-f92nl\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.421759 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.421918 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.921891951 +0000 UTC m=+234.413626276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.421958 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.422010 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vpf\" (UniqueName: \"kubernetes.io/projected/02cd6358-355c-4db8-b0f7-2528618602ff-kube-api-access-s5vpf\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.422041 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-catalog-content\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.422081 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-utilities\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.422392 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:34.922381256 +0000 UTC m=+234.414115581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.427658 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.501498 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.525548 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gmw2d"] Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.525902 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.526121 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5vpf\" (UniqueName: \"kubernetes.io/projected/02cd6358-355c-4db8-b0f7-2528618602ff-kube-api-access-s5vpf\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.526159 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-catalog-content\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.526226 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-utilities\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.526596 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.526686 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-utilities\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.526761 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.026746845 +0000 UTC m=+234.518481170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.527414 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-catalog-content\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.568340 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmw2d"] Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.580898 4733 plugin_watcher.go:194] "Adding socket path or updating timestamp to 
desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.582537 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vpf\" (UniqueName: \"kubernetes.io/projected/02cd6358-355c-4db8-b0f7-2528618602ff-kube-api-access-s5vpf\") pod \"certified-operators-w7rrs\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") " pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.636316 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-catalog-content\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.636402 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs5rx\" (UniqueName: \"kubernetes.io/projected/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-kube-api-access-rs5rx\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.636433 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.636820 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.136804125 +0000 UTC m=+234.628538450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.637035 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-utilities\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.722273 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.727636 4733 generic.go:334] "Generic (PLEG): container finished" podID="25331c44-b639-46f7-8a7f-6f62f8779e2b" containerID="3c09df7a275938153d455f147ffe12eff185edea72f1d8646898b9ba5ba684d5" exitCode=0 Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.727739 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" event={"ID":"25331c44-b639-46f7-8a7f-6f62f8779e2b","Type":"ContainerDied","Data":"3c09df7a275938153d455f147ffe12eff185edea72f1d8646898b9ba5ba684d5"} Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.735568 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" podUID="34ea1a9f-9093-421f-bef3-228352aa65fb" containerName="controller-manager" containerID="cri-o://03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23" gracePeriod=30 Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.735732 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" event={"ID":"ed943a82-ef39-4ebc-9d76-09bb69f3b800","Type":"ContainerStarted","Data":"9e583c49869d66f8518955622944a13bd3b0b124a8088e3d47f4120cdd1db2a7"} Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.735766 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" event={"ID":"ed943a82-ef39-4ebc-9d76-09bb69f3b800","Type":"ContainerStarted","Data":"54d261c692de36f53168ef4d6dffbad6c0b15a9ca777692c089156d4c4a063d1"} Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.738819 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.739179 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-utilities\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.739278 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-catalog-content\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.739326 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs5rx\" (UniqueName: \"kubernetes.io/projected/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-kube-api-access-rs5rx\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.739833 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.23981108 +0000 UTC m=+234.731545405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.740313 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-utilities\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.740550 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-catalog-content\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.768707 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs5rx\" (UniqueName: \"kubernetes.io/projected/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-kube-api-access-rs5rx\") pod \"community-operators-gmw2d\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.855117 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: 
\"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.856549 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.858586 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.358567629 +0000 UTC m=+234.850301954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.867801 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 10:16:34 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Mar 18 10:16:34 crc kubenswrapper[4733]: [+]process-running ok Mar 18 10:16:34 crc kubenswrapper[4733]: healthz check failed Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.867856 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 
10:16:34.956609 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:34 crc kubenswrapper[4733]: E0318 10:16:34.957087 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.45706341 +0000 UTC m=+234.948797735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.957313 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" Mar 18 10:16:34 crc kubenswrapper[4733]: I0318 10:16:34.990856 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-p4b5s" podStartSLOduration=10.9908261 podStartE2EDuration="10.9908261s" podCreationTimestamp="2026-03-18 10:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:34.789867221 +0000 UTC m=+234.281601546" watchObservedRunningTime="2026-03-18 10:16:34.9908261 +0000 UTC m=+234.482560425" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.058150 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25331c44-b639-46f7-8a7f-6f62f8779e2b-serving-cert\") pod \"25331c44-b639-46f7-8a7f-6f62f8779e2b\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.058260 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-config\") pod \"25331c44-b639-46f7-8a7f-6f62f8779e2b\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.058379 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzj8w\" (UniqueName: \"kubernetes.io/projected/25331c44-b639-46f7-8a7f-6f62f8779e2b-kube-api-access-rzj8w\") pod \"25331c44-b639-46f7-8a7f-6f62f8779e2b\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.059450 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-client-ca\") pod 
\"25331c44-b639-46f7-8a7f-6f62f8779e2b\" (UID: \"25331c44-b639-46f7-8a7f-6f62f8779e2b\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.059566 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-config" (OuterVolumeSpecName: "config") pod "25331c44-b639-46f7-8a7f-6f62f8779e2b" (UID: "25331c44-b639-46f7-8a7f-6f62f8779e2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.060232 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-client-ca" (OuterVolumeSpecName: "client-ca") pod "25331c44-b639-46f7-8a7f-6f62f8779e2b" (UID: "25331c44-b639-46f7-8a7f-6f62f8779e2b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.060415 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.060614 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.060639 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25331c44-b639-46f7-8a7f-6f62f8779e2b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:35 crc kubenswrapper[4733]: E0318 10:16:35.064173 4733 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.564149355 +0000 UTC m=+235.055883680 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.069111 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25331c44-b639-46f7-8a7f-6f62f8779e2b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25331c44-b639-46f7-8a7f-6f62f8779e2b" (UID: "25331c44-b639-46f7-8a7f-6f62f8779e2b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.071562 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25331c44-b639-46f7-8a7f-6f62f8779e2b-kube-api-access-rzj8w" (OuterVolumeSpecName: "kube-api-access-rzj8w") pod "25331c44-b639-46f7-8a7f-6f62f8779e2b" (UID: "25331c44-b639-46f7-8a7f-6f62f8779e2b"). InnerVolumeSpecName "kube-api-access-rzj8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.097746 4733 ???:1] "http: TLS handshake error from 192.168.126.11:33538: no serving certificate available for the kubelet" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.163254 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.163646 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25331c44-b639-46f7-8a7f-6f62f8779e2b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.163661 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzj8w\" (UniqueName: \"kubernetes.io/projected/25331c44-b639-46f7-8a7f-6f62f8779e2b-kube-api-access-rzj8w\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:35 crc kubenswrapper[4733]: E0318 10:16:35.163747 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.66372416 +0000 UTC m=+235.155458485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.168126 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f92nl"] Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.194967 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 10:16:35 crc kubenswrapper[4733]: E0318 10:16:35.195344 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25331c44-b639-46f7-8a7f-6f62f8779e2b" containerName="route-controller-manager" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.195423 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="25331c44-b639-46f7-8a7f-6f62f8779e2b" containerName="route-controller-manager" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.195598 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="25331c44-b639-46f7-8a7f-6f62f8779e2b" containerName="route-controller-manager" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.196048 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rls2r"] Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.196125 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.196271 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.200215 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.200338 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.265008 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.265054 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.265086 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:35 crc kubenswrapper[4733]: E0318 10:16:35.265501 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-18 10:16:35.765487836 +0000 UTC m=+235.257222161 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.296763 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.333143 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmw2d"] Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.360746 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w7rrs"] Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.366863 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-config\") pod \"34ea1a9f-9093-421f-bef3-228352aa65fb\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.367214 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll8g5\" (UniqueName: \"kubernetes.io/projected/34ea1a9f-9093-421f-bef3-228352aa65fb-kube-api-access-ll8g5\") pod \"34ea1a9f-9093-421f-bef3-228352aa65fb\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.367418 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea1a9f-9093-421f-bef3-228352aa65fb-serving-cert\") pod \"34ea1a9f-9093-421f-bef3-228352aa65fb\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.367533 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-proxy-ca-bundles\") pod \"34ea1a9f-9093-421f-bef3-228352aa65fb\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.369765 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.370980 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-client-ca\") pod \"34ea1a9f-9093-421f-bef3-228352aa65fb\" (UID: \"34ea1a9f-9093-421f-bef3-228352aa65fb\") " Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.371460 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-config" (OuterVolumeSpecName: "config") pod "34ea1a9f-9093-421f-bef3-228352aa65fb" (UID: "34ea1a9f-9093-421f-bef3-228352aa65fb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.373706 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "34ea1a9f-9093-421f-bef3-228352aa65fb" (UID: "34ea1a9f-9093-421f-bef3-228352aa65fb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.375614 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "34ea1a9f-9093-421f-bef3-228352aa65fb" (UID: "34ea1a9f-9093-421f-bef3-228352aa65fb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.376098 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ea1a9f-9093-421f-bef3-228352aa65fb-kube-api-access-ll8g5" (OuterVolumeSpecName: "kube-api-access-ll8g5") pod "34ea1a9f-9093-421f-bef3-228352aa65fb" (UID: "34ea1a9f-9093-421f-bef3-228352aa65fb"). InnerVolumeSpecName "kube-api-access-ll8g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:16:35 crc kubenswrapper[4733]: E0318 10:16:35.386902 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.886852968 +0000 UTC m=+235.378587293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.387032 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.387080 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.387136 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.387370 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 
10:16:35.387387 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll8g5\" (UniqueName: \"kubernetes.io/projected/34ea1a9f-9093-421f-bef3-228352aa65fb-kube-api-access-ll8g5\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.387399 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.387412 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ea1a9f-9093-421f-bef3-228352aa65fb-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:35 crc kubenswrapper[4733]: E0318 10:16:35.387778 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 10:16:35.887770307 +0000 UTC m=+235.379504632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nwhtg" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.387813 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.392309 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ea1a9f-9093-421f-bef3-228352aa65fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "34ea1a9f-9093-421f-bef3-228352aa65fb" (UID: "34ea1a9f-9093-421f-bef3-228352aa65fb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.403281 4733 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T10:16:34.580923168Z","Handler":null,"Name":""}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.421731 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"]
Mar 18 10:16:35 crc kubenswrapper[4733]: E0318 10:16:35.422168 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ea1a9f-9093-421f-bef3-228352aa65fb" containerName="controller-manager"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.422201 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ea1a9f-9093-421f-bef3-228352aa65fb" containerName="controller-manager"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.422409 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ea1a9f-9093-421f-bef3-228352aa65fb" containerName="controller-manager"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.423032 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.428593 4733 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.428650 4733 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.435745 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"]
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.437419 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.488527 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.488972 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea1a9f-9093-421f-bef3-228352aa65fb-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.492865 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.531816 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.590927 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1954584-f4d6-467a-8b0c-9db32f9e385c-serving-cert\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.591134 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.591228 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-client-ca\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.591298 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmg68\" (UniqueName: \"kubernetes.io/projected/d1954584-f4d6-467a-8b0c-9db32f9e385c-kube-api-access-dmg68\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.591488 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-config\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.599489 4733 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.599544 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.630505 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nwhtg\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.693354 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-config\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.693440 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1954584-f4d6-467a-8b0c-9db32f9e385c-serving-cert\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.693496 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-client-ca\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.693533 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmg68\" (UniqueName: \"kubernetes.io/projected/d1954584-f4d6-467a-8b0c-9db32f9e385c-kube-api-access-dmg68\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.695127 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-config\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.696824 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-client-ca\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.700347 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1954584-f4d6-467a-8b0c-9db32f9e385c-serving-cert\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.709733 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmg68\" (UniqueName: \"kubernetes.io/projected/d1954584-f4d6-467a-8b0c-9db32f9e385c-kube-api-access-dmg68\") pod \"route-controller-manager-6946475f8-lnppg\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") " pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.722782 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.741571 4733 generic.go:334] "Generic (PLEG): container finished" podID="34ea1a9f-9093-421f-bef3-228352aa65fb" containerID="03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23" exitCode=0
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.741683 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.741656 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" event={"ID":"34ea1a9f-9093-421f-bef3-228352aa65fb","Type":"ContainerDied","Data":"03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.742045 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7rr85" event={"ID":"34ea1a9f-9093-421f-bef3-228352aa65fb","Type":"ContainerDied","Data":"9ae227eb47f15060adbbd96eb5744108b09bf4ae0d948bfba7b04b2c867d1d95"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.742084 4733 scope.go:117] "RemoveContainer" containerID="03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.744322 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmw2d" event={"ID":"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc","Type":"ContainerStarted","Data":"a2b42f75b17ecdce018f92ac6406accaeca335b14c1245cfd417767d5e5802c4"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.747600 4733 generic.go:334] "Generic (PLEG): container finished" podID="92996997-080b-42c9-bc2c-19c2e68db896" containerID="a9bf744158dbc316b120322e1385bd5232386e738d2db0f1d91d2ac7d8a7ad1a" exitCode=0
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.747859 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rls2r" event={"ID":"92996997-080b-42c9-bc2c-19c2e68db896","Type":"ContainerDied","Data":"a9bf744158dbc316b120322e1385bd5232386e738d2db0f1d91d2ac7d8a7ad1a"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.747890 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rls2r" event={"ID":"92996997-080b-42c9-bc2c-19c2e68db896","Type":"ContainerStarted","Data":"448a9e96bdf06f234c1da361f4be5cda2d36bf670a134ff4f206711028d80cac"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.750528 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7rrs" event={"ID":"02cd6358-355c-4db8-b0f7-2528618602ff","Type":"ContainerStarted","Data":"bf716e26a7a1e4408c9cf17e7366833bdc30d38efd823adf2eb5d92d8a80e381"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.755441 4733 generic.go:334] "Generic (PLEG): container finished" podID="527056ad-4daf-4dd5-9e31-887d55be0336" containerID="83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274" exitCode=0
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.755587 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f92nl" event={"ID":"527056ad-4daf-4dd5-9e31-887d55be0336","Type":"ContainerDied","Data":"83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.756000 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f92nl" event={"ID":"527056ad-4daf-4dd5-9e31-887d55be0336","Type":"ContainerStarted","Data":"5d9e5dab0932c3cd3cd8b8f12fa8d0d49db59eddcefaa706bd16f11d86be1eac"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.779789 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.780384 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95" event={"ID":"25331c44-b639-46f7-8a7f-6f62f8779e2b","Type":"ContainerDied","Data":"0a4e2b2140bacea055efd9eb333f7f7f1da7235e623090af40eaf58bc070ecb2"}
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.813210 4733 scope.go:117] "RemoveContainer" containerID="03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23"
Mar 18 10:16:35 crc kubenswrapper[4733]: E0318 10:16:35.817109 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23\": container with ID starting with 03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23 not found: ID does not exist" containerID="03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.817176 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23"} err="failed to get container status \"03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23\": rpc error: code = NotFound desc = could not find container \"03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23\": container with ID starting with 03eb463e4dd483e49a80db7e81b412b8d031879512dded00ef00cc215cf14f23 not found: ID does not exist"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.817243 4733 scope.go:117] "RemoveContainer" containerID="3c09df7a275938153d455f147ffe12eff185edea72f1d8646898b9ba5ba684d5"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.830508 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7rr85"]
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.849980 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 10:16:35 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld
Mar 18 10:16:35 crc kubenswrapper[4733]: [+]process-running ok
Mar 18 10:16:35 crc kubenswrapper[4733]: healthz check failed
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.850039 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.861410 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7rr85"]
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.877353 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95"]
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.880264 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m5k95"]
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.891121 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.904742 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jb86w"]
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.908080 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.910464 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.917058 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb86w"]
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.970507 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.998269 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-utilities\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.998347 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-catalog-content\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:35 crc kubenswrapper[4733]: I0318 10:16:35.998418 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frtf\" (UniqueName: \"kubernetes.io/projected/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-kube-api-access-5frtf\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.101099 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-utilities\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.102923 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-catalog-content\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.103015 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5frtf\" (UniqueName: \"kubernetes.io/projected/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-kube-api-access-5frtf\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.102476 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-utilities\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.103709 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-catalog-content\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.130021 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frtf\" (UniqueName: \"kubernetes.io/projected/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-kube-api-access-5frtf\") pod \"redhat-marketplace-jb86w\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.222851 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nwhtg"]
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.242551 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.305869 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f82xf"]
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.306940 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.319211 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f82xf"]
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.408051 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-catalog-content\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.408678 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r2vl\" (UniqueName: \"kubernetes.io/projected/c91f12fa-96f0-442a-a3f7-70d56a697839-kube-api-access-6r2vl\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.408711 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-utilities\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.490329 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"]
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.509656 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-catalog-content\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.509729 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r2vl\" (UniqueName: \"kubernetes.io/projected/c91f12fa-96f0-442a-a3f7-70d56a697839-kube-api-access-6r2vl\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.509764 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-utilities\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.510278 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-catalog-content\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.510329 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-utilities\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: W0318 10:16:36.526784 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1954584_f4d6_467a_8b0c_9db32f9e385c.slice/crio-f9b9ef0617ed480f3b6667ef58bad6aa0295e3670f7f52f36e961678f655af59 WatchSource:0}: Error finding container f9b9ef0617ed480f3b6667ef58bad6aa0295e3670f7f52f36e961678f655af59: Status 404 returned error can't find the container with id f9b9ef0617ed480f3b6667ef58bad6aa0295e3670f7f52f36e961678f655af59
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.538674 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r2vl\" (UniqueName: \"kubernetes.io/projected/c91f12fa-96f0-442a-a3f7-70d56a697839-kube-api-access-6r2vl\") pod \"redhat-marketplace-f82xf\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") " pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.574121 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb86w"]
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.632228 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.798402 4733 generic.go:334] "Generic (PLEG): container finished" podID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerID="17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266" exitCode=0
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.798999 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmw2d" event={"ID":"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc","Type":"ContainerDied","Data":"17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.808292 4733 generic.go:334] "Generic (PLEG): container finished" podID="02cd6358-355c-4db8-b0f7-2528618602ff" containerID="0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed" exitCode=0
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.808745 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7rrs" event={"ID":"02cd6358-355c-4db8-b0f7-2528618602ff","Type":"ContainerDied","Data":"0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.819353 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb86w" event={"ID":"0fd306cb-05db-40e1-a1ec-9f811ce7fec0","Type":"ContainerStarted","Data":"152bb2d9d2d5d61c127ef6162804e32f4f4e993fb3a1aa90d7238cb79aedf035"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.819940 4733 patch_prober.go:28] interesting pod/downloads-7954f5f757-gxcb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body=
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.819991 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gxcb2" podUID="61e27ee7-5eb0-4cc7-a696-85ddd192b171" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.820644 4733 patch_prober.go:28] interesting pod/downloads-7954f5f757-gxcb2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body=
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.820670 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gxcb2" podUID="61e27ee7-5eb0-4cc7-a696-85ddd192b171" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.827888 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" event={"ID":"d1954584-f4d6-467a-8b0c-9db32f9e385c","Type":"ContainerStarted","Data":"bb3b81003e39ecbea213f0c0b02b7e8dae8e9507c2c01e812e0dce4d4f5c71d4"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.827960 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" event={"ID":"d1954584-f4d6-467a-8b0c-9db32f9e385c","Type":"ContainerStarted","Data":"f9b9ef0617ed480f3b6667ef58bad6aa0295e3670f7f52f36e961678f655af59"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.829873 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.840739 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xl5d7"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.871553 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"75f42e15-d1dc-4edf-8f2e-daef04ccc601","Type":"ContainerStarted","Data":"e35828e9f81ba55c9c2c8d38d1a7b2cf11a4f98596d388b277800589516f0e19"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.871603 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"75f42e15-d1dc-4edf-8f2e-daef04ccc601","Type":"ContainerStarted","Data":"d4949220d6469d09f06a115f37de0df57649271ca132ce8e25268effeb41c8de"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.874366 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 10:16:36 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld
Mar 18 10:16:36 crc kubenswrapper[4733]: [+]process-running ok
Mar 18 10:16:36 crc kubenswrapper[4733]: healthz check failed
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.874426 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.895348 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" event={"ID":"7b5dc098-4a15-429b-8243-1ac75ce2e0c1","Type":"ContainerStarted","Data":"d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.895413 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" event={"ID":"7b5dc098-4a15-429b-8243-1ac75ce2e0c1","Type":"ContainerStarted","Data":"bf9beab436bdff3f99c6c06c629fb5de1f2bcd079250aacd7d55627140dc6e11"}
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.896217 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.906543 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8v244"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.907110 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8v244"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.910876 4733 patch_prober.go:28] interesting pod/console-f9d7485db-8v244 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.910933 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8v244" podUID="f27409fc-b6dd-4573-918b-7b30b3635cc7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused"
Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.919207 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" podStartSLOduration=2.919164463 podStartE2EDuration="2.919164463s" podCreationTimestamp="2026-03-18 10:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:36.894670619 +0000 UTC
m=+236.386404944" watchObservedRunningTime="2026-03-18 10:16:36.919164463 +0000 UTC m=+236.410898788" Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.925526 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.925512346 podStartE2EDuration="1.925512346s" podCreationTimestamp="2026-03-18 10:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:36.921483597 +0000 UTC m=+236.413217922" watchObservedRunningTime="2026-03-18 10:16:36.925512346 +0000 UTC m=+236.417246671" Mar 18 10:16:36 crc kubenswrapper[4733]: I0318 10:16:36.947762 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" podStartSLOduration=178.947733267 podStartE2EDuration="2m58.947733267s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:36.947484019 +0000 UTC m=+236.439218354" watchObservedRunningTime="2026-03-18 10:16:36.947733267 +0000 UTC m=+236.439467592" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.012879 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f82xf"] Mar 18 10:16:37 crc kubenswrapper[4733]: W0318 10:16:37.025439 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc91f12fa_96f0_442a_a3f7_70d56a697839.slice/crio-a8fa061a3aa824aa80f6c1569abe326d18dccd731789c62f81d22de7e9a828d3 WatchSource:0}: Error finding container a8fa061a3aa824aa80f6c1569abe326d18dccd731789c62f81d22de7e9a828d3: Status 404 returned error can't find the container with id a8fa061a3aa824aa80f6c1569abe326d18dccd731789c62f81d22de7e9a828d3 Mar 18 
10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.193245 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25331c44-b639-46f7-8a7f-6f62f8779e2b" path="/var/lib/kubelet/pods/25331c44-b639-46f7-8a7f-6f62f8779e2b/volumes" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.193917 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ea1a9f-9093-421f-bef3-228352aa65fb" path="/var/lib/kubelet/pods/34ea1a9f-9093-421f-bef3-228352aa65fb/volumes" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.199406 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.208998 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.304547 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hrwxg"] Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.305736 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.308013 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.324323 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.325445 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.330150 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hrwxg"] Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.330544 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.330760 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.334563 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.379470 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-catalog-content\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.379542 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-utilities\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.379564 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsw8d\" (UniqueName: \"kubernetes.io/projected/fb7ed879-1474-4200-88d4-70e425e2bcb1-kube-api-access-jsw8d\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " 
pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.385579 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.412884 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75c9fbd49-qmfds"] Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.413662 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.421147 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.421559 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.421713 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.422143 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.423699 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.423872 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.435365 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 
10:16:37.444535 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75c9fbd49-qmfds"] Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.483377 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-catalog-content\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.484555 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-utilities\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.484592 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsw8d\" (UniqueName: \"kubernetes.io/projected/fb7ed879-1474-4200-88d4-70e425e2bcb1-kube-api-access-jsw8d\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.484642 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6503bd00-b300-438d-b10e-27380eaf7d9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6503bd00-b300-438d-b10e-27380eaf7d9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.484773 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6503bd00-b300-438d-b10e-27380eaf7d9a-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"6503bd00-b300-438d-b10e-27380eaf7d9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.485496 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-catalog-content\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.485641 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-utilities\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.525106 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsw8d\" (UniqueName: \"kubernetes.io/projected/fb7ed879-1474-4200-88d4-70e425e2bcb1-kube-api-access-jsw8d\") pod \"redhat-operators-hrwxg\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.568484 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.569038 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.588851 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e63f18-ff07-40b0-8289-351352d47d0a-serving-cert\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: 
\"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.588969 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6503bd00-b300-438d-b10e-27380eaf7d9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6503bd00-b300-438d-b10e-27380eaf7d9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.589069 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-client-ca\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.589113 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6503bd00-b300-438d-b10e-27380eaf7d9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6503bd00-b300-438d-b10e-27380eaf7d9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.589742 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-proxy-ca-bundles\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.589796 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvgcj\" (UniqueName: 
\"kubernetes.io/projected/24e63f18-ff07-40b0-8289-351352d47d0a-kube-api-access-nvgcj\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.589837 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-config\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.589873 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6503bd00-b300-438d-b10e-27380eaf7d9a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6503bd00-b300-438d-b10e-27380eaf7d9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.590627 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.625770 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6503bd00-b300-438d-b10e-27380eaf7d9a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6503bd00-b300-438d-b10e-27380eaf7d9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.637576 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.691434 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-client-ca\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.691532 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-proxy-ca-bundles\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.691784 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvgcj\" (UniqueName: \"kubernetes.io/projected/24e63f18-ff07-40b0-8289-351352d47d0a-kube-api-access-nvgcj\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.691816 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-config\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.691843 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e63f18-ff07-40b0-8289-351352d47d0a-serving-cert\") 
pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.694306 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-client-ca\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.697210 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-config\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.702477 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-proxy-ca-bundles\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.704704 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e63f18-ff07-40b0-8289-351352d47d0a-serving-cert\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.710242 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ll7dp"] Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 
10:16:37.713503 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.717925 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.720439 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvgcj\" (UniqueName: \"kubernetes.io/projected/24e63f18-ff07-40b0-8289-351352d47d0a-kube-api-access-nvgcj\") pod \"controller-manager-75c9fbd49-qmfds\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") " pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.728610 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ll7dp"] Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.740355 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.846551 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 10:16:37 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld Mar 18 10:16:37 crc kubenswrapper[4733]: [+]process-running ok Mar 18 10:16:37 crc kubenswrapper[4733]: healthz check failed Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.846616 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.896375 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txwcn\" (UniqueName: \"kubernetes.io/projected/82922e1e-56fb-432e-9441-b99bdb19fbb0-kube-api-access-txwcn\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.896455 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-catalog-content\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.896479 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-utilities\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.921290 4733 generic.go:334] "Generic (PLEG): container finished" podID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerID="deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e" exitCode=0 Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.921457 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb86w" event={"ID":"0fd306cb-05db-40e1-a1ec-9f811ce7fec0","Type":"ContainerDied","Data":"deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e"} Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.937829 4733 generic.go:334] "Generic (PLEG): container finished" podID="d915f7d2-5b4d-4017-a839-b615a182fafb" containerID="6068780e861c95e2a5524c6995b5943bf2eb924f4e716f49bfa978772d8dc58d" exitCode=0 Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.937965 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" event={"ID":"d915f7d2-5b4d-4017-a839-b615a182fafb","Type":"ContainerDied","Data":"6068780e861c95e2a5524c6995b5943bf2eb924f4e716f49bfa978772d8dc58d"} Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.949421 4733 generic.go:334] "Generic (PLEG): container finished" podID="75f42e15-d1dc-4edf-8f2e-daef04ccc601" containerID="e35828e9f81ba55c9c2c8d38d1a7b2cf11a4f98596d388b277800589516f0e19" exitCode=0 Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.949529 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"75f42e15-d1dc-4edf-8f2e-daef04ccc601","Type":"ContainerDied","Data":"e35828e9f81ba55c9c2c8d38d1a7b2cf11a4f98596d388b277800589516f0e19"} Mar 18 10:16:37 crc 
kubenswrapper[4733]: I0318 10:16:37.962917 4733 generic.go:334] "Generic (PLEG): container finished" podID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerID="6da3522bbcdb557467c36bac266a9dafb390a5a917de44dd30de9c3ac03051e1" exitCode=0
Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.963594 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f82xf" event={"ID":"c91f12fa-96f0-442a-a3f7-70d56a697839","Type":"ContainerDied","Data":"6da3522bbcdb557467c36bac266a9dafb390a5a917de44dd30de9c3ac03051e1"}
Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.963646 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f82xf" event={"ID":"c91f12fa-96f0-442a-a3f7-70d56a697839","Type":"ContainerStarted","Data":"a8fa061a3aa824aa80f6c1569abe326d18dccd731789c62f81d22de7e9a828d3"}
Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.973346 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xvnwv"
Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.998353 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txwcn\" (UniqueName: \"kubernetes.io/projected/82922e1e-56fb-432e-9441-b99bdb19fbb0-kube-api-access-txwcn\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp"
Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.998432 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-catalog-content\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp"
Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.998456 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-utilities\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp"
Mar 18 10:16:37 crc kubenswrapper[4733]: I0318 10:16:37.999388 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-utilities\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp"
Mar 18 10:16:38 crc kubenswrapper[4733]: I0318 10:16:38.000065 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-catalog-content\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp"
Mar 18 10:16:38 crc kubenswrapper[4733]: I0318 10:16:38.049365 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txwcn\" (UniqueName: \"kubernetes.io/projected/82922e1e-56fb-432e-9441-b99bdb19fbb0-kube-api-access-txwcn\") pod \"redhat-operators-ll7dp\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " pod="openshift-marketplace/redhat-operators-ll7dp"
Mar 18 10:16:38 crc kubenswrapper[4733]: I0318 10:16:38.071061 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hrwxg"]
Mar 18 10:16:38 crc kubenswrapper[4733]: I0318 10:16:38.252618 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 18 10:16:38 crc kubenswrapper[4733]: I0318 10:16:38.309133 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75c9fbd49-qmfds"]
Mar 18 10:16:38 crc kubenswrapper[4733]: I0318 10:16:38.337620 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ll7dp"
Mar 18 10:16:38 crc kubenswrapper[4733]: W0318 10:16:38.376158 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6503bd00_b300_438d_b10e_27380eaf7d9a.slice/crio-cde036bce187894fcff4e3f9314bf8376952111eb9bf66cb5384b076ad589e4e WatchSource:0}: Error finding container cde036bce187894fcff4e3f9314bf8376952111eb9bf66cb5384b076ad589e4e: Status 404 returned error can't find the container with id cde036bce187894fcff4e3f9314bf8376952111eb9bf66cb5384b076ad589e4e
Mar 18 10:16:38 crc kubenswrapper[4733]: W0318 10:16:38.400085 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e63f18_ff07_40b0_8289_351352d47d0a.slice/crio-4e6f1dbd850522a9734d4ded476f261e3ae7fcfb8b06688f8b915837852ee325 WatchSource:0}: Error finding container 4e6f1dbd850522a9734d4ded476f261e3ae7fcfb8b06688f8b915837852ee325: Status 404 returned error can't find the container with id 4e6f1dbd850522a9734d4ded476f261e3ae7fcfb8b06688f8b915837852ee325
Mar 18 10:16:38 crc kubenswrapper[4733]: I0318 10:16:38.886883 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 10:16:38 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld
Mar 18 10:16:38 crc kubenswrapper[4733]: [+]process-running ok
Mar 18 10:16:38 crc kubenswrapper[4733]: healthz check failed
Mar 18 10:16:38 crc kubenswrapper[4733]: I0318 10:16:38.887566 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.006638 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrwxg" event={"ID":"fb7ed879-1474-4200-88d4-70e425e2bcb1","Type":"ContainerStarted","Data":"62aa2aa87c6f58e0a138486db1e0ff0949ce50a5eef4891759673935a2791e3b"}
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.010653 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" event={"ID":"24e63f18-ff07-40b0-8289-351352d47d0a","Type":"ContainerStarted","Data":"4e6f1dbd850522a9734d4ded476f261e3ae7fcfb8b06688f8b915837852ee325"}
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.022023 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6503bd00-b300-438d-b10e-27380eaf7d9a","Type":"ContainerStarted","Data":"cde036bce187894fcff4e3f9314bf8376952111eb9bf66cb5384b076ad589e4e"}
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.127961 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ll7dp"]
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.496999 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.513337 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.594924 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d915f7d2-5b4d-4017-a839-b615a182fafb-config-volume\") pod \"d915f7d2-5b4d-4017-a839-b615a182fafb\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") "
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.594987 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kubelet-dir\") pod \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\" (UID: \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\") "
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.595018 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d915f7d2-5b4d-4017-a839-b615a182fafb-secret-volume\") pod \"d915f7d2-5b4d-4017-a839-b615a182fafb\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") "
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.596310 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "75f42e15-d1dc-4edf-8f2e-daef04ccc601" (UID: "75f42e15-d1dc-4edf-8f2e-daef04ccc601"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.596653 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d915f7d2-5b4d-4017-a839-b615a182fafb-config-volume" (OuterVolumeSpecName: "config-volume") pod "d915f7d2-5b4d-4017-a839-b615a182fafb" (UID: "d915f7d2-5b4d-4017-a839-b615a182fafb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.603018 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d915f7d2-5b4d-4017-a839-b615a182fafb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d915f7d2-5b4d-4017-a839-b615a182fafb" (UID: "d915f7d2-5b4d-4017-a839-b615a182fafb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.707849 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lfsh\" (UniqueName: \"kubernetes.io/projected/d915f7d2-5b4d-4017-a839-b615a182fafb-kube-api-access-8lfsh\") pod \"d915f7d2-5b4d-4017-a839-b615a182fafb\" (UID: \"d915f7d2-5b4d-4017-a839-b615a182fafb\") "
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.707955 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kube-api-access\") pod \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\" (UID: \"75f42e15-d1dc-4edf-8f2e-daef04ccc601\") "
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.708427 4733 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.708447 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d915f7d2-5b4d-4017-a839-b615a182fafb-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.708459 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d915f7d2-5b4d-4017-a839-b615a182fafb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.713323 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d915f7d2-5b4d-4017-a839-b615a182fafb-kube-api-access-8lfsh" (OuterVolumeSpecName: "kube-api-access-8lfsh") pod "d915f7d2-5b4d-4017-a839-b615a182fafb" (UID: "d915f7d2-5b4d-4017-a839-b615a182fafb"). InnerVolumeSpecName "kube-api-access-8lfsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.716279 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "75f42e15-d1dc-4edf-8f2e-daef04ccc601" (UID: "75f42e15-d1dc-4edf-8f2e-daef04ccc601"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.809381 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lfsh\" (UniqueName: \"kubernetes.io/projected/d915f7d2-5b4d-4017-a839-b615a182fafb-kube-api-access-8lfsh\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.809417 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75f42e15-d1dc-4edf-8f2e-daef04ccc601-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.844703 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 10:16:39 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld
Mar 18 10:16:39 crc kubenswrapper[4733]: [+]process-running ok
Mar 18 10:16:39 crc kubenswrapper[4733]: healthz check failed
Mar 18 10:16:39 crc kubenswrapper[4733]: I0318 10:16:39.844769 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.063525 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"75f42e15-d1dc-4edf-8f2e-daef04ccc601","Type":"ContainerDied","Data":"d4949220d6469d09f06a115f37de0df57649271ca132ce8e25268effeb41c8de"}
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.063601 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4949220d6469d09f06a115f37de0df57649271ca132ce8e25268effeb41c8de"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.064057 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.068078 4733 generic.go:334] "Generic (PLEG): container finished" podID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerID="aa0522bdc088c10a6b3c5dba1e3ad5057a62e8ded941287c75083cef63e55041" exitCode=0
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.068162 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrwxg" event={"ID":"fb7ed879-1474-4200-88d4-70e425e2bcb1","Type":"ContainerDied","Data":"aa0522bdc088c10a6b3c5dba1e3ad5057a62e8ded941287c75083cef63e55041"}
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.071840 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" event={"ID":"24e63f18-ff07-40b0-8289-351352d47d0a","Type":"ContainerStarted","Data":"9eef5c0d799a7506ebb7268521b337cf05cff4092c361a61789da71fa3bb245c"}
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.076330 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.096388 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.097213 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7dp" event={"ID":"82922e1e-56fb-432e-9441-b99bdb19fbb0","Type":"ContainerStarted","Data":"1a1211028e93b8b114b76fa499d9200418412506c6795f17a8a464f56e421c4c"}
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.113993 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6" event={"ID":"d915f7d2-5b4d-4017-a839-b615a182fafb","Type":"ContainerDied","Data":"4abe99dbd7bc3b694bde422289cb5e8d4d69c342990c6d29b9ffcb65e8f885f7"}
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.114067 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4abe99dbd7bc3b694bde422289cb5e8d4d69c342990c6d29b9ffcb65e8f885f7"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.114108 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.131575 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" podStartSLOduration=7.131558511 podStartE2EDuration="7.131558511s" podCreationTimestamp="2026-03-18 10:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:16:40.131178899 +0000 UTC m=+239.622913224" watchObservedRunningTime="2026-03-18 10:16:40.131558511 +0000 UTC m=+239.623292826"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.135374 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6503bd00-b300-438d-b10e-27380eaf7d9a","Type":"ContainerStarted","Data":"5097184223deb67237547660bb54a8b2f0c30328be9b17c9a3cc7d81a39aded5"}
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.249260 4733 ???:1] "http: TLS handshake error from 192.168.126.11:33552: no serving certificate available for the kubelet"
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.856182 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 10:16:40 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld
Mar 18 10:16:40 crc kubenswrapper[4733]: [+]process-running ok
Mar 18 10:16:40 crc kubenswrapper[4733]: healthz check failed
Mar 18 10:16:40 crc kubenswrapper[4733]: I0318 10:16:40.856361 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 10:16:41 crc kubenswrapper[4733]: I0318 10:16:41.211785 4733 generic.go:334] "Generic (PLEG): container finished" podID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerID="9d9d502e889f0bc1ff5ac5bd25eb5937fb15878b89bb5f2186b3e420cda96e62" exitCode=0
Mar 18 10:16:41 crc kubenswrapper[4733]: I0318 10:16:41.226013 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7dp" event={"ID":"82922e1e-56fb-432e-9441-b99bdb19fbb0","Type":"ContainerDied","Data":"9d9d502e889f0bc1ff5ac5bd25eb5937fb15878b89bb5f2186b3e420cda96e62"}
Mar 18 10:16:41 crc kubenswrapper[4733]: I0318 10:16:41.241733 4733 generic.go:334] "Generic (PLEG): container finished" podID="6503bd00-b300-438d-b10e-27380eaf7d9a" containerID="5097184223deb67237547660bb54a8b2f0c30328be9b17c9a3cc7d81a39aded5" exitCode=0
Mar 18 10:16:41 crc kubenswrapper[4733]: I0318 10:16:41.241897 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6503bd00-b300-438d-b10e-27380eaf7d9a","Type":"ContainerDied","Data":"5097184223deb67237547660bb54a8b2f0c30328be9b17c9a3cc7d81a39aded5"}
Mar 18 10:16:41 crc kubenswrapper[4733]: I0318 10:16:41.841575 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 10:16:41 crc kubenswrapper[4733]: [-]has-synced failed: reason withheld
Mar 18 10:16:41 crc kubenswrapper[4733]: [+]process-running ok
Mar 18 10:16:41 crc kubenswrapper[4733]: healthz check failed
Mar 18 10:16:41 crc kubenswrapper[4733]: I0318 10:16:41.841631 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 10:16:42 crc kubenswrapper[4733]: I0318 10:16:42.796624 4733 ???:1] "http: TLS handshake error from 192.168.126.11:33562: no serving certificate available for the kubelet"
Mar 18 10:16:42 crc kubenswrapper[4733]: I0318 10:16:42.844339 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xl5d7"
Mar 18 10:16:42 crc kubenswrapper[4733]: I0318 10:16:42.847302 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xl5d7"
Mar 18 10:16:42 crc kubenswrapper[4733]: I0318 10:16:42.898640 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hvmrz"
Mar 18 10:16:43 crc kubenswrapper[4733]: I0318 10:16:43.571906 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:16:43 crc kubenswrapper[4733]: I0318 10:16:43.572001 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:16:43 crc kubenswrapper[4733]: I0318 10:16:43.605993 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd"
Mar 18 10:16:46 crc kubenswrapper[4733]: I0318 10:16:46.829780 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gxcb2"
Mar 18 10:16:46 crc kubenswrapper[4733]: I0318 10:16:46.922605 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8v244"
Mar 18 10:16:46 crc kubenswrapper[4733]: I0318 10:16:46.928158 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8v244"
Mar 18 10:16:50 crc kubenswrapper[4733]: I0318 10:16:50.521807 4733 ???:1] "http: TLS handshake error from 192.168.126.11:42986: no serving certificate available for the kubelet"
Mar 18 10:16:53 crc kubenswrapper[4733]: I0318 10:16:53.140102 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75c9fbd49-qmfds"]
Mar 18 10:16:53 crc kubenswrapper[4733]: I0318 10:16:53.140696 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" podUID="24e63f18-ff07-40b0-8289-351352d47d0a" containerName="controller-manager" containerID="cri-o://9eef5c0d799a7506ebb7268521b337cf05cff4092c361a61789da71fa3bb245c" gracePeriod=30
Mar 18 10:16:53 crc kubenswrapper[4733]: I0318 10:16:53.165356 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"]
Mar 18 10:16:53 crc kubenswrapper[4733]: I0318 10:16:53.165645 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" podUID="d1954584-f4d6-467a-8b0c-9db32f9e385c" containerName="route-controller-manager" containerID="cri-o://bb3b81003e39ecbea213f0c0b02b7e8dae8e9507c2c01e812e0dce4d4f5c71d4" gracePeriod=30
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.202792 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.354995 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6503bd00-b300-438d-b10e-27380eaf7d9a","Type":"ContainerDied","Data":"cde036bce187894fcff4e3f9314bf8376952111eb9bf66cb5384b076ad589e4e"}
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.355077 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cde036bce187894fcff4e3f9314bf8376952111eb9bf66cb5384b076ad589e4e"
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.355070 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.357053 4733 generic.go:334] "Generic (PLEG): container finished" podID="24e63f18-ff07-40b0-8289-351352d47d0a" containerID="9eef5c0d799a7506ebb7268521b337cf05cff4092c361a61789da71fa3bb245c" exitCode=0
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.357179 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" event={"ID":"24e63f18-ff07-40b0-8289-351352d47d0a","Type":"ContainerDied","Data":"9eef5c0d799a7506ebb7268521b337cf05cff4092c361a61789da71fa3bb245c"}
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.361955 4733 generic.go:334] "Generic (PLEG): container finished" podID="d1954584-f4d6-467a-8b0c-9db32f9e385c" containerID="bb3b81003e39ecbea213f0c0b02b7e8dae8e9507c2c01e812e0dce4d4f5c71d4" exitCode=0
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.362218 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" event={"ID":"d1954584-f4d6-467a-8b0c-9db32f9e385c","Type":"ContainerDied","Data":"bb3b81003e39ecbea213f0c0b02b7e8dae8e9507c2c01e812e0dce4d4f5c71d4"}
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.384690 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6503bd00-b300-438d-b10e-27380eaf7d9a-kubelet-dir\") pod \"6503bd00-b300-438d-b10e-27380eaf7d9a\" (UID: \"6503bd00-b300-438d-b10e-27380eaf7d9a\") "
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.384817 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6503bd00-b300-438d-b10e-27380eaf7d9a-kube-api-access\") pod \"6503bd00-b300-438d-b10e-27380eaf7d9a\" (UID: \"6503bd00-b300-438d-b10e-27380eaf7d9a\") "
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.384855 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6503bd00-b300-438d-b10e-27380eaf7d9a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6503bd00-b300-438d-b10e-27380eaf7d9a" (UID: "6503bd00-b300-438d-b10e-27380eaf7d9a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.385906 4733 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6503bd00-b300-438d-b10e-27380eaf7d9a-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.410473 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6503bd00-b300-438d-b10e-27380eaf7d9a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6503bd00-b300-438d-b10e-27380eaf7d9a" (UID: "6503bd00-b300-438d-b10e-27380eaf7d9a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:16:54 crc kubenswrapper[4733]: I0318 10:16:54.486804 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6503bd00-b300-438d-b10e-27380eaf7d9a-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 10:16:55 crc kubenswrapper[4733]: I0318 10:16:55.728519 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg"
Mar 18 10:16:55 crc kubenswrapper[4733]: I0318 10:16:55.972393 4733 patch_prober.go:28] interesting pod/route-controller-manager-6946475f8-lnppg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body=
Mar 18 10:16:55 crc kubenswrapper[4733]: I0318 10:16:55.972437 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" podUID="d1954584-f4d6-467a-8b0c-9db32f9e385c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused"
Mar 18 10:16:57 crc kubenswrapper[4733]: E0318 10:16:57.471285 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 18 10:16:57 crc kubenswrapper[4733]: E0318 10:16:57.471738 4733 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 10:16:57 crc kubenswrapper[4733]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 18 10:16:57 crc kubenswrapper[4733]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2whnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563816-4582s_openshift-infra(71a70c3c-d483-43f4-9f54-10978c7f8cc8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 18 10:16:57 crc kubenswrapper[4733]: > logger="UnhandledError"
Mar 18 10:16:57 crc kubenswrapper[4733]: E0318 10:16:57.472832 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563816-4582s" podUID="71a70c3c-d483-43f4-9f54-10978c7f8cc8"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.807757 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.820500 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867210 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6748588445-r4bnh"]
Mar 18 10:16:57 crc kubenswrapper[4733]: E0318 10:16:57.867513 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e63f18-ff07-40b0-8289-351352d47d0a" containerName="controller-manager"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867526 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e63f18-ff07-40b0-8289-351352d47d0a" containerName="controller-manager"
Mar 18 10:16:57 crc kubenswrapper[4733]: E0318 10:16:57.867545 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6503bd00-b300-438d-b10e-27380eaf7d9a" containerName="pruner"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867552 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6503bd00-b300-438d-b10e-27380eaf7d9a" containerName="pruner"
Mar 18 10:16:57 crc kubenswrapper[4733]: E0318 10:16:57.867565 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1954584-f4d6-467a-8b0c-9db32f9e385c" containerName="route-controller-manager"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867574 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1954584-f4d6-467a-8b0c-9db32f9e385c" containerName="route-controller-manager"
Mar 18 10:16:57 crc kubenswrapper[4733]: E0318 10:16:57.867585 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d915f7d2-5b4d-4017-a839-b615a182fafb" containerName="collect-profiles"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867591 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d915f7d2-5b4d-4017-a839-b615a182fafb" containerName="collect-profiles"
Mar 18 10:16:57 crc kubenswrapper[4733]: E0318 10:16:57.867598 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f42e15-d1dc-4edf-8f2e-daef04ccc601" containerName="pruner"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867604 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f42e15-d1dc-4edf-8f2e-daef04ccc601" containerName="pruner"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867732 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d915f7d2-5b4d-4017-a839-b615a182fafb" containerName="collect-profiles"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867742 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f42e15-d1dc-4edf-8f2e-daef04ccc601" containerName="pruner"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867753 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1954584-f4d6-467a-8b0c-9db32f9e385c" containerName="route-controller-manager"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867762 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e63f18-ff07-40b0-8289-351352d47d0a" containerName="controller-manager"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.867772 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="6503bd00-b300-438d-b10e-27380eaf7d9a" containerName="pruner"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.868139 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh"
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.869740 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6748588445-r4bnh"]
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.934127 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1954584-f4d6-467a-8b0c-9db32f9e385c-serving-cert\") pod \"d1954584-f4d6-467a-8b0c-9db32f9e385c\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.934205 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e63f18-ff07-40b0-8289-351352d47d0a-serving-cert\") pod \"24e63f18-ff07-40b0-8289-351352d47d0a\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.934247 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmg68\" (UniqueName: \"kubernetes.io/projected/d1954584-f4d6-467a-8b0c-9db32f9e385c-kube-api-access-dmg68\") pod \"d1954584-f4d6-467a-8b0c-9db32f9e385c\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.934306 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-config\") pod \"24e63f18-ff07-40b0-8289-351352d47d0a\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.934367 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-config\") pod \"d1954584-f4d6-467a-8b0c-9db32f9e385c\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.934391 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-client-ca\") pod \"d1954584-f4d6-467a-8b0c-9db32f9e385c\" (UID: \"d1954584-f4d6-467a-8b0c-9db32f9e385c\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.934420 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvgcj\" (UniqueName: \"kubernetes.io/projected/24e63f18-ff07-40b0-8289-351352d47d0a-kube-api-access-nvgcj\") pod \"24e63f18-ff07-40b0-8289-351352d47d0a\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.935336 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-proxy-ca-bundles\") pod \"24e63f18-ff07-40b0-8289-351352d47d0a\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.935347 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-config" (OuterVolumeSpecName: "config") pod "d1954584-f4d6-467a-8b0c-9db32f9e385c" (UID: "d1954584-f4d6-467a-8b0c-9db32f9e385c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.935403 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-client-ca\") pod \"24e63f18-ff07-40b0-8289-351352d47d0a\" (UID: \"24e63f18-ff07-40b0-8289-351352d47d0a\") "
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.935451 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-config" (OuterVolumeSpecName: "config") pod "24e63f18-ff07-40b0-8289-351352d47d0a" (UID: "24e63f18-ff07-40b0-8289-351352d47d0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.935449 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1954584-f4d6-467a-8b0c-9db32f9e385c" (UID: "d1954584-f4d6-467a-8b0c-9db32f9e385c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.935865 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "24e63f18-ff07-40b0-8289-351352d47d0a" (UID: "24e63f18-ff07-40b0-8289-351352d47d0a"). InnerVolumeSpecName "proxy-ca-bundles".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.936729 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-client-ca" (OuterVolumeSpecName: "client-ca") pod "24e63f18-ff07-40b0-8289-351352d47d0a" (UID: "24e63f18-ff07-40b0-8289-351352d47d0a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.937060 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.937087 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.937099 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1954584-f4d6-467a-8b0c-9db32f9e385c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.937110 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.937125 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24e63f18-ff07-40b0-8289-351352d47d0a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.940692 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e63f18-ff07-40b0-8289-351352d47d0a-kube-api-access-nvgcj" 
(OuterVolumeSpecName: "kube-api-access-nvgcj") pod "24e63f18-ff07-40b0-8289-351352d47d0a" (UID: "24e63f18-ff07-40b0-8289-351352d47d0a"). InnerVolumeSpecName "kube-api-access-nvgcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.941804 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1954584-f4d6-467a-8b0c-9db32f9e385c-kube-api-access-dmg68" (OuterVolumeSpecName: "kube-api-access-dmg68") pod "d1954584-f4d6-467a-8b0c-9db32f9e385c" (UID: "d1954584-f4d6-467a-8b0c-9db32f9e385c"). InnerVolumeSpecName "kube-api-access-dmg68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.943111 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e63f18-ff07-40b0-8289-351352d47d0a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24e63f18-ff07-40b0-8289-351352d47d0a" (UID: "24e63f18-ff07-40b0-8289-351352d47d0a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:16:57 crc kubenswrapper[4733]: I0318 10:16:57.952719 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1954584-f4d6-467a-8b0c-9db32f9e385c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1954584-f4d6-467a-8b0c-9db32f9e385c" (UID: "d1954584-f4d6-467a-8b0c-9db32f9e385c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.037948 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-config\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.037999 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-proxy-ca-bundles\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.038029 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzm9t\" (UniqueName: \"kubernetes.io/projected/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-kube-api-access-hzm9t\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.038342 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-client-ca\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.038403 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-serving-cert\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.038545 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvgcj\" (UniqueName: \"kubernetes.io/projected/24e63f18-ff07-40b0-8289-351352d47d0a-kube-api-access-nvgcj\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.038596 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1954584-f4d6-467a-8b0c-9db32f9e385c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.038610 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e63f18-ff07-40b0-8289-351352d47d0a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.038623 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmg68\" (UniqueName: \"kubernetes.io/projected/d1954584-f4d6-467a-8b0c-9db32f9e385c-kube-api-access-dmg68\") on node \"crc\" DevicePath \"\"" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.139405 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-client-ca\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.139460 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-serving-cert\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.139516 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-config\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.139535 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-proxy-ca-bundles\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.139557 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzm9t\" (UniqueName: \"kubernetes.io/projected/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-kube-api-access-hzm9t\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.140983 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-client-ca\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.141396 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-config\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.141770 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-proxy-ca-bundles\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.145779 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-serving-cert\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.157988 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzm9t\" (UniqueName: \"kubernetes.io/projected/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-kube-api-access-hzm9t\") pod \"controller-manager-6748588445-r4bnh\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.191399 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.391250 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.391293 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" event={"ID":"24e63f18-ff07-40b0-8289-351352d47d0a","Type":"ContainerDied","Data":"4e6f1dbd850522a9734d4ded476f261e3ae7fcfb8b06688f8b915837852ee325"} Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.391698 4733 scope.go:117] "RemoveContainer" containerID="9eef5c0d799a7506ebb7268521b337cf05cff4092c361a61789da71fa3bb245c" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.394028 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.394050 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg" event={"ID":"d1954584-f4d6-467a-8b0c-9db32f9e385c","Type":"ContainerDied","Data":"f9b9ef0617ed480f3b6667ef58bad6aa0295e3670f7f52f36e961678f655af59"} Mar 18 10:16:58 crc kubenswrapper[4733]: E0318 10:16:58.395535 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563816-4582s" podUID="71a70c3c-d483-43f4-9f54-10978c7f8cc8" Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.424936 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75c9fbd49-qmfds"] Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.430144 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75c9fbd49-qmfds"] Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.439449 
4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"] Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.442364 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6946475f8-lnppg"] Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.741038 4733 patch_prober.go:28] interesting pod/controller-manager-75c9fbd49-qmfds container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:16:58 crc kubenswrapper[4733]: I0318 10:16:58.741101 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-75c9fbd49-qmfds" podUID="24e63f18-ff07-40b0-8289-351352d47d0a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 10:16:59 crc kubenswrapper[4733]: I0318 10:16:59.184518 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24e63f18-ff07-40b0-8289-351352d47d0a" path="/var/lib/kubelet/pods/24e63f18-ff07-40b0-8289-351352d47d0a/volumes" Mar 18 10:16:59 crc kubenswrapper[4733]: I0318 10:16:59.185651 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1954584-f4d6-467a-8b0c-9db32f9e385c" path="/var/lib/kubelet/pods/d1954584-f4d6-467a-8b0c-9db32f9e385c/volumes" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.428971 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6"] Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.430323 4733 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.432989 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.433079 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.433161 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.434680 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.441299 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.441576 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.446724 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6"] Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.607482 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-client-ca\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.607619 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjn9\" (UniqueName: \"kubernetes.io/projected/8ad622ed-fcbc-4485-a927-639a658660fa-kube-api-access-pzjn9\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.607700 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-config\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.607736 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad622ed-fcbc-4485-a927-639a658660fa-serving-cert\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.709805 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-client-ca\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.709893 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjn9\" (UniqueName: 
\"kubernetes.io/projected/8ad622ed-fcbc-4485-a927-639a658660fa-kube-api-access-pzjn9\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.709942 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-config\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.709961 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad622ed-fcbc-4485-a927-639a658660fa-serving-cert\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.711343 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-client-ca\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.711461 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-config\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc 
kubenswrapper[4733]: I0318 10:17:02.717139 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad622ed-fcbc-4485-a927-639a658660fa-serving-cert\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.725274 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjn9\" (UniqueName: \"kubernetes.io/projected/8ad622ed-fcbc-4485-a927-639a658660fa-kube-api-access-pzjn9\") pod \"route-controller-manager-6d8fb44d7f-zrsx6\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:02 crc kubenswrapper[4733]: I0318 10:17:02.778485 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:07 crc kubenswrapper[4733]: I0318 10:17:07.859143 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" Mar 18 10:17:09 crc kubenswrapper[4733]: E0318 10:17:09.754049 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 10:17:09 crc kubenswrapper[4733]: E0318 10:17:09.754311 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s5vpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-w7rrs_openshift-marketplace(02cd6358-355c-4db8-b0f7-2528618602ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 10:17:09 crc kubenswrapper[4733]: E0318 10:17:09.755989 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-w7rrs" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" Mar 18 10:17:11 crc 
kubenswrapper[4733]: E0318 10:17:11.359396 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-w7rrs" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" Mar 18 10:17:11 crc kubenswrapper[4733]: I0318 10:17:11.897635 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 10:17:11 crc kubenswrapper[4733]: I0318 10:17:11.898403 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:11 crc kubenswrapper[4733]: I0318 10:17:11.900670 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 10:17:11 crc kubenswrapper[4733]: I0318 10:17:11.900998 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 10:17:11 crc kubenswrapper[4733]: I0318 10:17:11.915782 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 10:17:12 crc kubenswrapper[4733]: I0318 10:17:12.090017 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/068ddaf0-0f79-459b-b064-1b90505c36ca-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"068ddaf0-0f79-459b-b064-1b90505c36ca\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:12 crc kubenswrapper[4733]: I0318 10:17:12.090133 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/068ddaf0-0f79-459b-b064-1b90505c36ca-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"068ddaf0-0f79-459b-b064-1b90505c36ca\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:12 crc kubenswrapper[4733]: E0318 10:17:12.157245 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 10:17:12 crc kubenswrapper[4733]: E0318 10:17:12.157623 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5frtf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:ni
l,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jb86w_openshift-marketplace(0fd306cb-05db-40e1-a1ec-9f811ce7fec0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 10:17:12 crc kubenswrapper[4733]: E0318 10:17:12.159036 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jb86w" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" Mar 18 10:17:12 crc kubenswrapper[4733]: I0318 10:17:12.191691 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/068ddaf0-0f79-459b-b064-1b90505c36ca-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"068ddaf0-0f79-459b-b064-1b90505c36ca\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:12 crc kubenswrapper[4733]: I0318 10:17:12.191832 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/068ddaf0-0f79-459b-b064-1b90505c36ca-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"068ddaf0-0f79-459b-b064-1b90505c36ca\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:12 crc kubenswrapper[4733]: I0318 10:17:12.192265 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/068ddaf0-0f79-459b-b064-1b90505c36ca-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"068ddaf0-0f79-459b-b064-1b90505c36ca\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:12 crc kubenswrapper[4733]: I0318 10:17:12.214123 4733 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/068ddaf0-0f79-459b-b064-1b90505c36ca-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"068ddaf0-0f79-459b-b064-1b90505c36ca\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:12 crc kubenswrapper[4733]: I0318 10:17:12.262123 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:12 crc kubenswrapper[4733]: I0318 10:17:12.330927 4733 scope.go:117] "RemoveContainer" containerID="bb3b81003e39ecbea213f0c0b02b7e8dae8e9507c2c01e812e0dce4d4f5c71d4" Mar 18 10:17:12 crc kubenswrapper[4733]: E0318 10:17:12.916211 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 10:17:12 crc kubenswrapper[4733]: E0318 10:17:12.916890 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7hv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rls2r_openshift-marketplace(92996997-080b-42c9-bc2c-19c2e68db896): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 10:17:12 crc kubenswrapper[4733]: E0318 10:17:12.918572 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rls2r" podUID="92996997-080b-42c9-bc2c-19c2e68db896" Mar 18 10:17:13 crc 
kubenswrapper[4733]: E0318 10:17:13.122965 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jb86w" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" Mar 18 10:17:13 crc kubenswrapper[4733]: I0318 10:17:13.141115 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6748588445-r4bnh"] Mar 18 10:17:13 crc kubenswrapper[4733]: I0318 10:17:13.249951 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6"] Mar 18 10:17:13 crc kubenswrapper[4733]: E0318 10:17:13.295275 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 10:17:13 crc kubenswrapper[4733]: E0318 10:17:13.295572 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6r2vl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f82xf_openshift-marketplace(c91f12fa-96f0-442a-a3f7-70d56a697839): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 10:17:13 crc kubenswrapper[4733]: E0318 10:17:13.296971 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f82xf" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" Mar 18 10:17:13 crc 
kubenswrapper[4733]: I0318 10:17:13.571683 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:17:13 crc kubenswrapper[4733]: I0318 10:17:13.571756 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:17:13 crc kubenswrapper[4733]: E0318 10:17:13.641709 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rls2r" podUID="92996997-080b-42c9-bc2c-19c2e68db896" Mar 18 10:17:13 crc kubenswrapper[4733]: E0318 10:17:13.641944 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f82xf" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.193643 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6748588445-r4bnh"] Mar 18 10:17:14 crc kubenswrapper[4733]: W0318 10:17:14.204592 4733 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d4bb8b_0ef7_4aae_9eca_d24dbb957f8b.slice/crio-729502b90d95e5266fca1fcfc3473b605230425c19cf5542d40820061af70fa1 WatchSource:0}: Error finding container 729502b90d95e5266fca1fcfc3473b605230425c19cf5542d40820061af70fa1: Status 404 returned error can't find the container with id 729502b90d95e5266fca1fcfc3473b605230425c19cf5542d40820061af70fa1 Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.279729 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6"] Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.286983 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.482423 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" event={"ID":"8ad622ed-fcbc-4485-a927-639a658660fa","Type":"ContainerStarted","Data":"c17a0094029923b923cf507d45924db60f811cb6ad73561bc864ff132db3d533"} Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.483537 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"068ddaf0-0f79-459b-b064-1b90505c36ca","Type":"ContainerStarted","Data":"a37d483c7ee4ef126cb1f876ea9f88989d5b67cbbc348139e49a23862aa79e05"} Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.499926 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmw2d" event={"ID":"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc","Type":"ContainerStarted","Data":"c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c"} Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.505095 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrwxg" 
event={"ID":"fb7ed879-1474-4200-88d4-70e425e2bcb1","Type":"ContainerStarted","Data":"340cfa7d2b8654b1dea28355651bf6f54381a8104d827e6d38142ffcaf93e8ae"} Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.513115 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" event={"ID":"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b","Type":"ContainerStarted","Data":"729502b90d95e5266fca1fcfc3473b605230425c19cf5542d40820061af70fa1"} Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.516133 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7dp" event={"ID":"82922e1e-56fb-432e-9441-b99bdb19fbb0","Type":"ContainerStarted","Data":"29a9561b8927709b5dd59a92cbf81b78eacad45dd6ac5ec49191d6faee246d53"} Mar 18 10:17:14 crc kubenswrapper[4733]: I0318 10:17:14.519343 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f92nl" event={"ID":"527056ad-4daf-4dd5-9e31-887d55be0336","Type":"ContainerStarted","Data":"29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.530669 4733 generic.go:334] "Generic (PLEG): container finished" podID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerID="29a9561b8927709b5dd59a92cbf81b78eacad45dd6ac5ec49191d6faee246d53" exitCode=0 Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.530758 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7dp" event={"ID":"82922e1e-56fb-432e-9441-b99bdb19fbb0","Type":"ContainerDied","Data":"29a9561b8927709b5dd59a92cbf81b78eacad45dd6ac5ec49191d6faee246d53"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.533409 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-4582s" 
event={"ID":"71a70c3c-d483-43f4-9f54-10978c7f8cc8","Type":"ContainerStarted","Data":"869578488a5526adb52c0d5efeb676ea68e5c20e95b1cf2d208fa00dbd02baca"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.536488 4733 generic.go:334] "Generic (PLEG): container finished" podID="527056ad-4daf-4dd5-9e31-887d55be0336" containerID="29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4" exitCode=0 Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.536607 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f92nl" event={"ID":"527056ad-4daf-4dd5-9e31-887d55be0336","Type":"ContainerDied","Data":"29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.538624 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" event={"ID":"8ad622ed-fcbc-4485-a927-639a658660fa","Type":"ContainerStarted","Data":"363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.538744 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" podUID="8ad622ed-fcbc-4485-a927-639a658660fa" containerName="route-controller-manager" containerID="cri-o://363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b" gracePeriod=30 Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.538906 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.542667 4733 generic.go:334] "Generic (PLEG): container finished" podID="068ddaf0-0f79-459b-b064-1b90505c36ca" containerID="88cfa7738cd509c2ad7f6041d284fa78af0a64d896c50f75f0a19f578d7cb91d" exitCode=0 Mar 18 10:17:15 crc 
kubenswrapper[4733]: I0318 10:17:15.542789 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"068ddaf0-0f79-459b-b064-1b90505c36ca","Type":"ContainerDied","Data":"88cfa7738cd509c2ad7f6041d284fa78af0a64d896c50f75f0a19f578d7cb91d"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.547080 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.547596 4733 generic.go:334] "Generic (PLEG): container finished" podID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerID="c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c" exitCode=0 Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.547640 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmw2d" event={"ID":"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc","Type":"ContainerDied","Data":"c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.551313 4733 generic.go:334] "Generic (PLEG): container finished" podID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerID="340cfa7d2b8654b1dea28355651bf6f54381a8104d827e6d38142ffcaf93e8ae" exitCode=0 Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.551401 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrwxg" event={"ID":"fb7ed879-1474-4200-88d4-70e425e2bcb1","Type":"ContainerDied","Data":"340cfa7d2b8654b1dea28355651bf6f54381a8104d827e6d38142ffcaf93e8ae"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.553054 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" 
event={"ID":"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b","Type":"ContainerStarted","Data":"99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277"} Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.553179 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" podUID="19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" containerName="controller-manager" containerID="cri-o://99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277" gracePeriod=30 Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.553308 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.562322 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.583475 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" podStartSLOduration=22.583454564 podStartE2EDuration="22.583454564s" podCreationTimestamp="2026-03-18 10:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:17:15.581768987 +0000 UTC m=+275.073503312" watchObservedRunningTime="2026-03-18 10:17:15.583454564 +0000 UTC m=+275.075188889" Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.634636 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" podStartSLOduration=22.634603006 podStartE2EDuration="22.634603006s" podCreationTimestamp="2026-03-18 10:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 10:17:15.632468076 +0000 UTC m=+275.124202401" watchObservedRunningTime="2026-03-18 10:17:15.634603006 +0000 UTC m=+275.126337331" Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.684222 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563816-4582s" podStartSLOduration=29.799564362 podStartE2EDuration="1m15.684167824s" podCreationTimestamp="2026-03-18 10:16:00 +0000 UTC" firstStartedPulling="2026-03-18 10:16:28.967450925 +0000 UTC m=+228.459185250" lastFinishedPulling="2026-03-18 10:17:14.852054387 +0000 UTC m=+274.343788712" observedRunningTime="2026-03-18 10:17:15.674881114 +0000 UTC m=+275.166615439" watchObservedRunningTime="2026-03-18 10:17:15.684167824 +0000 UTC m=+275.175902149" Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.694352 4733 csr.go:261] certificate signing request csr-8nwhj is approved, waiting to be issued Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.711933 4733 csr.go:257] certificate signing request csr-8nwhj is issued Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.973656 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:15 crc kubenswrapper[4733]: I0318 10:17:15.982551 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.004110 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn"] Mar 18 10:17:16 crc kubenswrapper[4733]: E0318 10:17:16.004933 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad622ed-fcbc-4485-a927-639a658660fa" containerName="route-controller-manager" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.004953 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad622ed-fcbc-4485-a927-639a658660fa" containerName="route-controller-manager" Mar 18 10:17:16 crc kubenswrapper[4733]: E0318 10:17:16.004972 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" containerName="controller-manager" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.004979 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" containerName="controller-manager" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.005076 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" containerName="controller-manager" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.005096 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad622ed-fcbc-4485-a927-639a658660fa" containerName="route-controller-manager" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.005489 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.016183 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn"] Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.051015 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/debb4f43-7f9a-4fdc-9896-db5106650a74-serving-cert\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.051220 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-config\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.051283 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-client-ca\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.051301 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwfk4\" (UniqueName: \"kubernetes.io/projected/debb4f43-7f9a-4fdc-9896-db5106650a74-kube-api-access-wwfk4\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: 
\"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.152847 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzm9t\" (UniqueName: \"kubernetes.io/projected/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-kube-api-access-hzm9t\") pod \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.152918 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-config\") pod \"8ad622ed-fcbc-4485-a927-639a658660fa\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153015 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-serving-cert\") pod \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153236 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-client-ca\") pod \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153292 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-config\") pod \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153332 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-pzjn9\" (UniqueName: \"kubernetes.io/projected/8ad622ed-fcbc-4485-a927-639a658660fa-kube-api-access-pzjn9\") pod \"8ad622ed-fcbc-4485-a927-639a658660fa\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153398 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-proxy-ca-bundles\") pod \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\" (UID: \"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153454 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad622ed-fcbc-4485-a927-639a658660fa-serving-cert\") pod \"8ad622ed-fcbc-4485-a927-639a658660fa\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153510 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-client-ca\") pod \"8ad622ed-fcbc-4485-a927-639a658660fa\" (UID: \"8ad622ed-fcbc-4485-a927-639a658660fa\") " Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153796 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/debb4f43-7f9a-4fdc-9896-db5106650a74-serving-cert\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153859 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-config\") pod 
\"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153905 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-client-ca\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.153927 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwfk4\" (UniqueName: \"kubernetes.io/projected/debb4f43-7f9a-4fdc-9896-db5106650a74-kube-api-access-wwfk4\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.154102 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-client-ca" (OuterVolumeSpecName: "client-ca") pod "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" (UID: "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.155024 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-config" (OuterVolumeSpecName: "config") pod "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" (UID: "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.155354 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-client-ca" (OuterVolumeSpecName: "client-ca") pod "8ad622ed-fcbc-4485-a927-639a658660fa" (UID: "8ad622ed-fcbc-4485-a927-639a658660fa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.155629 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" (UID: "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.157004 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-config\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.157229 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-client-ca\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.157223 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-config" 
(OuterVolumeSpecName: "config") pod "8ad622ed-fcbc-4485-a927-639a658660fa" (UID: "8ad622ed-fcbc-4485-a927-639a658660fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.159736 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" (UID: "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.160276 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad622ed-fcbc-4485-a927-639a658660fa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8ad622ed-fcbc-4485-a927-639a658660fa" (UID: "8ad622ed-fcbc-4485-a927-639a658660fa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.160276 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad622ed-fcbc-4485-a927-639a658660fa-kube-api-access-pzjn9" (OuterVolumeSpecName: "kube-api-access-pzjn9") pod "8ad622ed-fcbc-4485-a927-639a658660fa" (UID: "8ad622ed-fcbc-4485-a927-639a658660fa"). InnerVolumeSpecName "kube-api-access-pzjn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.160374 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-kube-api-access-hzm9t" (OuterVolumeSpecName: "kube-api-access-hzm9t") pod "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" (UID: "19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b"). InnerVolumeSpecName "kube-api-access-hzm9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.160554 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/debb4f43-7f9a-4fdc-9896-db5106650a74-serving-cert\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.171886 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwfk4\" (UniqueName: \"kubernetes.io/projected/debb4f43-7f9a-4fdc-9896-db5106650a74-kube-api-access-wwfk4\") pod \"route-controller-manager-6966b4866d-xm6hn\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255737 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255785 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad622ed-fcbc-4485-a927-639a658660fa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255798 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255809 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzm9t\" (UniqueName: \"kubernetes.io/projected/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-kube-api-access-hzm9t\") on node \"crc\" DevicePath \"\"" 
Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255824 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ad622ed-fcbc-4485-a927-639a658660fa-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255837 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255849 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255860 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.255873 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzjn9\" (UniqueName: \"kubernetes.io/projected/8ad622ed-fcbc-4485-a927-639a658660fa-kube-api-access-pzjn9\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.406530 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.572296 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmw2d" event={"ID":"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc","Type":"ContainerStarted","Data":"59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.575888 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrwxg" event={"ID":"fb7ed879-1474-4200-88d4-70e425e2bcb1","Type":"ContainerStarted","Data":"33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.577516 4733 generic.go:334] "Generic (PLEG): container finished" podID="19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" containerID="99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277" exitCode=0 Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.577590 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" event={"ID":"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b","Type":"ContainerDied","Data":"99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.577636 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" event={"ID":"19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b","Type":"ContainerDied","Data":"729502b90d95e5266fca1fcfc3473b605230425c19cf5542d40820061af70fa1"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.577636 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6748588445-r4bnh" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.577663 4733 scope.go:117] "RemoveContainer" containerID="99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.591316 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7dp" event={"ID":"82922e1e-56fb-432e-9441-b99bdb19fbb0","Type":"ContainerStarted","Data":"7a59787487ae6ad4c8775bad7fe0e44e006e25d60fd069b4d5bd8cc6ceca6c70"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.603717 4733 generic.go:334] "Generic (PLEG): container finished" podID="71a70c3c-d483-43f4-9f54-10978c7f8cc8" containerID="869578488a5526adb52c0d5efeb676ea68e5c20e95b1cf2d208fa00dbd02baca" exitCode=0 Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.604222 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-4582s" event={"ID":"71a70c3c-d483-43f4-9f54-10978c7f8cc8","Type":"ContainerDied","Data":"869578488a5526adb52c0d5efeb676ea68e5c20e95b1cf2d208fa00dbd02baca"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.616702 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gmw2d" podStartSLOduration=3.414821918 podStartE2EDuration="42.616668181s" podCreationTimestamp="2026-03-18 10:16:34 +0000 UTC" firstStartedPulling="2026-03-18 10:16:36.800759136 +0000 UTC m=+236.292493471" lastFinishedPulling="2026-03-18 10:17:16.002605409 +0000 UTC m=+275.494339734" observedRunningTime="2026-03-18 10:17:16.613763599 +0000 UTC m=+276.105497934" watchObservedRunningTime="2026-03-18 10:17:16.616668181 +0000 UTC m=+276.108402506" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.628171 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f92nl" 
event={"ID":"527056ad-4daf-4dd5-9e31-887d55be0336","Type":"ContainerStarted","Data":"d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.640002 4733 scope.go:117] "RemoveContainer" containerID="99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277" Mar 18 10:17:16 crc kubenswrapper[4733]: E0318 10:17:16.640799 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277\": container with ID starting with 99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277 not found: ID does not exist" containerID="99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.640882 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277"} err="failed to get container status \"99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277\": rpc error: code = NotFound desc = could not find container \"99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277\": container with ID starting with 99d80b19731c17ac26d38096f7c5b3a305c502e2750b077f48abc2058aa5d277 not found: ID does not exist" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.640922 4733 generic.go:334] "Generic (PLEG): container finished" podID="8ad622ed-fcbc-4485-a927-639a658660fa" containerID="363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b" exitCode=0 Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.640996 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.640992 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" event={"ID":"8ad622ed-fcbc-4485-a927-639a658660fa","Type":"ContainerDied","Data":"363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.641053 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6" event={"ID":"8ad622ed-fcbc-4485-a927-639a658660fa","Type":"ContainerDied","Data":"c17a0094029923b923cf507d45924db60f811cb6ad73561bc864ff132db3d533"} Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.641078 4733 scope.go:117] "RemoveContainer" containerID="363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.656742 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hrwxg" podStartSLOduration=3.497819402 podStartE2EDuration="39.656727892s" podCreationTimestamp="2026-03-18 10:16:37 +0000 UTC" firstStartedPulling="2026-03-18 10:16:40.070658083 +0000 UTC m=+239.562392408" lastFinishedPulling="2026-03-18 10:17:16.229566573 +0000 UTC m=+275.721300898" observedRunningTime="2026-03-18 10:17:16.655544269 +0000 UTC m=+276.147278584" watchObservedRunningTime="2026-03-18 10:17:16.656727892 +0000 UTC m=+276.148462217" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.669395 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6748588445-r4bnh"] Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.678222 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6748588445-r4bnh"] Mar 
18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.689176 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ll7dp" podStartSLOduration=15.974903489 podStartE2EDuration="39.68916051s" podCreationTimestamp="2026-03-18 10:16:37 +0000 UTC" firstStartedPulling="2026-03-18 10:16:52.239528192 +0000 UTC m=+251.731262537" lastFinishedPulling="2026-03-18 10:17:15.953785233 +0000 UTC m=+275.445519558" observedRunningTime="2026-03-18 10:17:16.68699282 +0000 UTC m=+276.178727155" watchObservedRunningTime="2026-03-18 10:17:16.68916051 +0000 UTC m=+276.180894825" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.703481 4733 scope.go:117] "RemoveContainer" containerID="363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b" Mar 18 10:17:16 crc kubenswrapper[4733]: E0318 10:17:16.707793 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b\": container with ID starting with 363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b not found: ID does not exist" containerID="363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.707831 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b"} err="failed to get container status \"363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b\": rpc error: code = NotFound desc = could not find container \"363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b\": container with ID starting with 363048c696b7c0c6504ff378d37d2a0ff6eece67133ad5d68c872cadbe32058b not found: ID does not exist" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.717341 4733 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-13 13:56:53.35255902 +0000 UTC Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.717405 4733 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5763h39m36.635157219s for next certificate rotation Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.732133 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn"] Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.750672 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f92nl" podStartSLOduration=2.622889314 podStartE2EDuration="42.750646152s" podCreationTimestamp="2026-03-18 10:16:34 +0000 UTC" firstStartedPulling="2026-03-18 10:16:35.812810954 +0000 UTC m=+235.304545279" lastFinishedPulling="2026-03-18 10:17:15.940567792 +0000 UTC m=+275.432302117" observedRunningTime="2026-03-18 10:17:16.734540251 +0000 UTC m=+276.226274576" watchObservedRunningTime="2026-03-18 10:17:16.750646152 +0000 UTC m=+276.242380477" Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.761081 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6"] Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.777479 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8fb44d7f-zrsx6"] Mar 18 10:17:16 crc kubenswrapper[4733]: I0318 10:17:16.988698 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.169647 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/068ddaf0-0f79-459b-b064-1b90505c36ca-kubelet-dir\") pod \"068ddaf0-0f79-459b-b064-1b90505c36ca\" (UID: \"068ddaf0-0f79-459b-b064-1b90505c36ca\") " Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.169739 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/068ddaf0-0f79-459b-b064-1b90505c36ca-kube-api-access\") pod \"068ddaf0-0f79-459b-b064-1b90505c36ca\" (UID: \"068ddaf0-0f79-459b-b064-1b90505c36ca\") " Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.169800 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/068ddaf0-0f79-459b-b064-1b90505c36ca-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "068ddaf0-0f79-459b-b064-1b90505c36ca" (UID: "068ddaf0-0f79-459b-b064-1b90505c36ca"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.170043 4733 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/068ddaf0-0f79-459b-b064-1b90505c36ca-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.180486 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068ddaf0-0f79-459b-b064-1b90505c36ca-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "068ddaf0-0f79-459b-b064-1b90505c36ca" (UID: "068ddaf0-0f79-459b-b064-1b90505c36ca"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.195833 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b" path="/var/lib/kubelet/pods/19d4bb8b-0ef7-4aae-9eca-d24dbb957f8b/volumes" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.197024 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad622ed-fcbc-4485-a927-639a658660fa" path="/var/lib/kubelet/pods/8ad622ed-fcbc-4485-a927-639a658660fa/volumes" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.271433 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/068ddaf0-0f79-459b-b064-1b90505c36ca-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.639297 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.641175 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.650404 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" event={"ID":"debb4f43-7f9a-4fdc-9896-db5106650a74","Type":"ContainerStarted","Data":"7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740"} Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.650431 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" event={"ID":"debb4f43-7f9a-4fdc-9896-db5106650a74","Type":"ContainerStarted","Data":"f30c8eb187bc3102906d7afcd8a11a3b91672d508cc51863174f4d3b52b10205"} Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.651470 4733 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.656241 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"068ddaf0-0f79-459b-b064-1b90505c36ca","Type":"ContainerDied","Data":"a37d483c7ee4ef126cb1f876ea9f88989d5b67cbbc348139e49a23862aa79e05"} Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.656311 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a37d483c7ee4ef126cb1f876ea9f88989d5b67cbbc348139e49a23862aa79e05" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.656350 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.667607 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.676428 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" podStartSLOduration=4.676403871 podStartE2EDuration="4.676403871s" podCreationTimestamp="2026-03-18 10:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:17:17.674357064 +0000 UTC m=+277.166091399" watchObservedRunningTime="2026-03-18 10:17:17.676403871 +0000 UTC m=+277.168138206" Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.718092 4733 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-06 05:20:34.513480736 +0000 UTC Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.718535 4733 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6307h3m16.795158741s for next certificate rotation Mar 18 10:17:17 crc kubenswrapper[4733]: I0318 10:17:17.957331 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-4582s" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.087975 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2whnv\" (UniqueName: \"kubernetes.io/projected/71a70c3c-d483-43f4-9f54-10978c7f8cc8-kube-api-access-2whnv\") pod \"71a70c3c-d483-43f4-9f54-10978c7f8cc8\" (UID: \"71a70c3c-d483-43f4-9f54-10978c7f8cc8\") " Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.098832 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a70c3c-d483-43f4-9f54-10978c7f8cc8-kube-api-access-2whnv" (OuterVolumeSpecName: "kube-api-access-2whnv") pod "71a70c3c-d483-43f4-9f54-10978c7f8cc8" (UID: "71a70c3c-d483-43f4-9f54-10978c7f8cc8"). InnerVolumeSpecName "kube-api-access-2whnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.189231 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2whnv\" (UniqueName: \"kubernetes.io/projected/71a70c3c-d483-43f4-9f54-10978c7f8cc8-kube-api-access-2whnv\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.337848 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.337927 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.446259 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cf89766c-wb795"] Mar 18 10:17:18 crc kubenswrapper[4733]: E0318 10:17:18.446720 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a70c3c-d483-43f4-9f54-10978c7f8cc8" containerName="oc" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.446733 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a70c3c-d483-43f4-9f54-10978c7f8cc8" containerName="oc" Mar 18 10:17:18 crc kubenswrapper[4733]: E0318 10:17:18.446748 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068ddaf0-0f79-459b-b064-1b90505c36ca" containerName="pruner" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.446755 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="068ddaf0-0f79-459b-b064-1b90505c36ca" containerName="pruner" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.446853 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a70c3c-d483-43f4-9f54-10978c7f8cc8" containerName="oc" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.446864 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="068ddaf0-0f79-459b-b064-1b90505c36ca" containerName="pruner" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.447213 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.448635 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.452218 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.452784 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.452841 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.453069 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.454476 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.469842 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.470019 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cf89766c-wb795"] Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.528682 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-config\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.528751 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-proxy-ca-bundles\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.528817 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-client-ca\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.528869 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-575kc\" (UniqueName: \"kubernetes.io/projected/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-kube-api-access-575kc\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.528898 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-serving-cert\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 
10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.630213 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-client-ca\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.630313 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575kc\" (UniqueName: \"kubernetes.io/projected/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-kube-api-access-575kc\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.630343 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-serving-cert\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.630408 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-config\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.630439 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-proxy-ca-bundles\") pod \"controller-manager-7cf89766c-wb795\" (UID: 
\"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.631647 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-client-ca\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.631966 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-config\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.632110 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-proxy-ca-bundles\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.638958 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-serving-cert\") pod \"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.656794 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575kc\" (UniqueName: \"kubernetes.io/projected/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-kube-api-access-575kc\") pod 
\"controller-manager-7cf89766c-wb795\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.670478 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563816-4582s" event={"ID":"71a70c3c-d483-43f4-9f54-10978c7f8cc8","Type":"ContainerDied","Data":"f9019fd1aca4002d61050c62413d5f0b6ff4613e81da7416fe1c8a2924a20e45"} Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.670527 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9019fd1aca4002d61050c62413d5f0b6ff4613e81da7416fe1c8a2924a20e45" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.670769 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563816-4582s" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.759592 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:18 crc kubenswrapper[4733]: I0318 10:17:18.818604 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hrwxg" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="registry-server" probeResult="failure" output=< Mar 18 10:17:18 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Mar 18 10:17:18 crc kubenswrapper[4733]: > Mar 18 10:17:19 crc kubenswrapper[4733]: I0318 10:17:19.006710 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cf89766c-wb795"] Mar 18 10:17:19 crc kubenswrapper[4733]: W0318 10:17:19.013495 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf280b8dd_9b2f_4faa_9335_ca84961ea0a6.slice/crio-15b644538c3645c79d06f703952a07b7b4ae39d43d1dc32a31009b361d9095cd WatchSource:0}: Error finding container 15b644538c3645c79d06f703952a07b7b4ae39d43d1dc32a31009b361d9095cd: Status 404 returned error can't find the container with id 15b644538c3645c79d06f703952a07b7b4ae39d43d1dc32a31009b361d9095cd Mar 18 10:17:19 crc kubenswrapper[4733]: I0318 10:17:19.385870 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ll7dp" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="registry-server" probeResult="failure" output=< Mar 18 10:17:19 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Mar 18 10:17:19 crc kubenswrapper[4733]: > Mar 18 10:17:19 crc kubenswrapper[4733]: I0318 10:17:19.683662 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" event={"ID":"f280b8dd-9b2f-4faa-9335-ca84961ea0a6","Type":"ContainerStarted","Data":"c282a4dca9ca3d5260c23c43faf22ce611247dc38c8752227edd1fbf81bf90dd"} Mar 
18 10:17:19 crc kubenswrapper[4733]: I0318 10:17:19.683712 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" event={"ID":"f280b8dd-9b2f-4faa-9335-ca84961ea0a6","Type":"ContainerStarted","Data":"15b644538c3645c79d06f703952a07b7b4ae39d43d1dc32a31009b361d9095cd"} Mar 18 10:17:19 crc kubenswrapper[4733]: I0318 10:17:19.684016 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:19 crc kubenswrapper[4733]: I0318 10:17:19.689060 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:19 crc kubenswrapper[4733]: I0318 10:17:19.701124 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" podStartSLOduration=6.7011000670000005 podStartE2EDuration="6.701100067s" podCreationTimestamp="2026-03-18 10:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:17:19.69834471 +0000 UTC m=+279.190079045" watchObservedRunningTime="2026-03-18 10:17:19.701100067 +0000 UTC m=+279.192834392" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.695800 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.696812 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.701902 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.702182 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.760806 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.760911 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-var-lock\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.760937 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bc6618-8df4-4a35-9469-772a853eff06-kube-api-access\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.862110 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-var-lock\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc 
kubenswrapper[4733]: I0318 10:17:20.862625 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bc6618-8df4-4a35-9469-772a853eff06-kube-api-access\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.862696 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.862793 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-kubelet-dir\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.862840 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-var-lock\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:20 crc kubenswrapper[4733]: I0318 10:17:20.892063 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bc6618-8df4-4a35-9469-772a853eff06-kube-api-access\") pod \"installer-9-crc\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:21 crc kubenswrapper[4733]: I0318 10:17:21.022280 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:17:21 crc kubenswrapper[4733]: I0318 10:17:21.160720 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 10:17:21 crc kubenswrapper[4733]: I0318 10:17:21.413022 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 10:17:22 crc kubenswrapper[4733]: I0318 10:17:22.139111 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71bc6618-8df4-4a35-9469-772a853eff06","Type":"ContainerStarted","Data":"5f44602afd79b72c25bccf945c72fd688dbc42ff6b86533bf0722398bd85fb3d"} Mar 18 10:17:23 crc kubenswrapper[4733]: I0318 10:17:23.144921 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71bc6618-8df4-4a35-9469-772a853eff06","Type":"ContainerStarted","Data":"7f5d2a4800b0b935a593d622bab229709f8902d75a6b9d3d310047bf50063a1a"} Mar 18 10:17:23 crc kubenswrapper[4733]: I0318 10:17:23.150758 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7rrs" event={"ID":"02cd6358-355c-4db8-b0f7-2528618602ff","Type":"ContainerStarted","Data":"6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450"} Mar 18 10:17:23 crc kubenswrapper[4733]: I0318 10:17:23.189387 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.18934915 podStartE2EDuration="3.18934915s" podCreationTimestamp="2026-03-18 10:17:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:17:23.181056997 +0000 UTC m=+282.672791322" watchObservedRunningTime="2026-03-18 10:17:23.18934915 +0000 UTC m=+282.681083495" Mar 18 10:17:24 crc kubenswrapper[4733]: I0318 10:17:24.158674 4733 
generic.go:334] "Generic (PLEG): container finished" podID="02cd6358-355c-4db8-b0f7-2528618602ff" containerID="6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450" exitCode=0 Mar 18 10:17:24 crc kubenswrapper[4733]: I0318 10:17:24.158774 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7rrs" event={"ID":"02cd6358-355c-4db8-b0f7-2528618602ff","Type":"ContainerDied","Data":"6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450"} Mar 18 10:17:24 crc kubenswrapper[4733]: I0318 10:17:24.503411 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:17:24 crc kubenswrapper[4733]: I0318 10:17:24.503480 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:17:24 crc kubenswrapper[4733]: I0318 10:17:24.560416 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:17:24 crc kubenswrapper[4733]: I0318 10:17:24.859045 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:17:24 crc kubenswrapper[4733]: I0318 10:17:24.859453 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:17:24 crc kubenswrapper[4733]: I0318 10:17:24.900908 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:17:25 crc kubenswrapper[4733]: I0318 10:17:25.168034 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7rrs" event={"ID":"02cd6358-355c-4db8-b0f7-2528618602ff","Type":"ContainerStarted","Data":"0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6"} Mar 18 10:17:25 crc 
kubenswrapper[4733]: I0318 10:17:25.199177 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w7rrs" podStartSLOduration=3.472350661 podStartE2EDuration="51.199154028s" podCreationTimestamp="2026-03-18 10:16:34 +0000 UTC" firstStartedPulling="2026-03-18 10:16:36.811562101 +0000 UTC m=+236.303296426" lastFinishedPulling="2026-03-18 10:17:24.538365448 +0000 UTC m=+284.030099793" observedRunningTime="2026-03-18 10:17:25.196106243 +0000 UTC m=+284.687840578" watchObservedRunningTime="2026-03-18 10:17:25.199154028 +0000 UTC m=+284.690888363" Mar 18 10:17:25 crc kubenswrapper[4733]: I0318 10:17:25.232544 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:17:25 crc kubenswrapper[4733]: I0318 10:17:25.234203 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:17:26 crc kubenswrapper[4733]: I0318 10:17:26.177922 4733 generic.go:334] "Generic (PLEG): container finished" podID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerID="4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9" exitCode=0 Mar 18 10:17:26 crc kubenswrapper[4733]: I0318 10:17:26.178082 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb86w" event={"ID":"0fd306cb-05db-40e1-a1ec-9f811ce7fec0","Type":"ContainerDied","Data":"4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9"} Mar 18 10:17:27 crc kubenswrapper[4733]: I0318 10:17:27.188949 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb86w" event={"ID":"0fd306cb-05db-40e1-a1ec-9f811ce7fec0","Type":"ContainerStarted","Data":"36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014"} Mar 18 10:17:27 crc kubenswrapper[4733]: I0318 10:17:27.189924 4733 generic.go:334] "Generic (PLEG): 
container finished" podID="92996997-080b-42c9-bc2c-19c2e68db896" containerID="9295312051c24cc07301903e63a22c698207253e2dd4d338c0be4c6fd4de6dec" exitCode=0 Mar 18 10:17:27 crc kubenswrapper[4733]: I0318 10:17:27.189971 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rls2r" event={"ID":"92996997-080b-42c9-bc2c-19c2e68db896","Type":"ContainerDied","Data":"9295312051c24cc07301903e63a22c698207253e2dd4d338c0be4c6fd4de6dec"} Mar 18 10:17:27 crc kubenswrapper[4733]: I0318 10:17:27.251394 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jb86w" podStartSLOduration=3.614479629 podStartE2EDuration="52.251358894s" podCreationTimestamp="2026-03-18 10:16:35 +0000 UTC" firstStartedPulling="2026-03-18 10:16:37.924782901 +0000 UTC m=+237.416517226" lastFinishedPulling="2026-03-18 10:17:26.561662156 +0000 UTC m=+286.053396491" observedRunningTime="2026-03-18 10:17:27.24978807 +0000 UTC m=+286.741522425" watchObservedRunningTime="2026-03-18 10:17:27.251358894 +0000 UTC m=+286.743093229" Mar 18 10:17:27 crc kubenswrapper[4733]: I0318 10:17:27.694777 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:17:27 crc kubenswrapper[4733]: I0318 10:17:27.748824 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:17:28 crc kubenswrapper[4733]: I0318 10:17:28.405702 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:17:28 crc kubenswrapper[4733]: I0318 10:17:28.460150 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:17:29 crc kubenswrapper[4733]: I0318 10:17:29.423180 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-gmw2d"] Mar 18 10:17:29 crc kubenswrapper[4733]: I0318 10:17:29.423880 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gmw2d" podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerName="registry-server" containerID="cri-o://59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d" gracePeriod=2 Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.075653 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.216156 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-catalog-content\") pod \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.216267 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs5rx\" (UniqueName: \"kubernetes.io/projected/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-kube-api-access-rs5rx\") pod \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.216376 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-utilities\") pod \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\" (UID: \"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc\") " Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.217246 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-utilities" (OuterVolumeSpecName: "utilities") pod "7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" (UID: 
"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.219506 4733 generic.go:334] "Generic (PLEG): container finished" podID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerID="59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d" exitCode=0 Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.219580 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmw2d" event={"ID":"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc","Type":"ContainerDied","Data":"59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d"} Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.219615 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmw2d" event={"ID":"7eb97f2d-18fa-4e8c-895f-de4602c9dbbc","Type":"ContainerDied","Data":"a2b42f75b17ecdce018f92ac6406accaeca335b14c1245cfd417767d5e5802c4"} Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.219641 4733 scope.go:117] "RemoveContainer" containerID="59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.219770 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gmw2d" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.222606 4733 generic.go:334] "Generic (PLEG): container finished" podID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerID="e4094d8b4eb850df07d3a19e616d2e9c130ee64b58c626295f963df49e875ea5" exitCode=0 Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.222663 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f82xf" event={"ID":"c91f12fa-96f0-442a-a3f7-70d56a697839","Type":"ContainerDied","Data":"e4094d8b4eb850df07d3a19e616d2e9c130ee64b58c626295f963df49e875ea5"} Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.230393 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-kube-api-access-rs5rx" (OuterVolumeSpecName: "kube-api-access-rs5rx") pod "7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" (UID: "7eb97f2d-18fa-4e8c-895f-de4602c9dbbc"). InnerVolumeSpecName "kube-api-access-rs5rx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.231709 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rls2r" event={"ID":"92996997-080b-42c9-bc2c-19c2e68db896","Type":"ContainerStarted","Data":"7e9cf80fc09f50439f722c47d01b38f7d154cd5514d553f0573a5303858564f5"} Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.264814 4733 scope.go:117] "RemoveContainer" containerID="c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.280558 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rls2r" podStartSLOduration=4.117886001 podStartE2EDuration="57.280534814s" podCreationTimestamp="2026-03-18 10:16:33 +0000 UTC" firstStartedPulling="2026-03-18 10:16:35.749727766 +0000 UTC m=+235.241462091" lastFinishedPulling="2026-03-18 10:17:28.912376579 +0000 UTC m=+288.404110904" observedRunningTime="2026-03-18 10:17:30.272480679 +0000 UTC m=+289.764215014" watchObservedRunningTime="2026-03-18 10:17:30.280534814 +0000 UTC m=+289.772269139" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.285727 4733 scope.go:117] "RemoveContainer" containerID="17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.286176 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" (UID: "7eb97f2d-18fa-4e8c-895f-de4602c9dbbc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.301039 4733 scope.go:117] "RemoveContainer" containerID="59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d" Mar 18 10:17:30 crc kubenswrapper[4733]: E0318 10:17:30.301757 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d\": container with ID starting with 59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d not found: ID does not exist" containerID="59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.301823 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d"} err="failed to get container status \"59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d\": rpc error: code = NotFound desc = could not find container \"59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d\": container with ID starting with 59fff67ee2ee5eba64f97e1503253ee04deab9883241245399247be534c88d2d not found: ID does not exist" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.301863 4733 scope.go:117] "RemoveContainer" containerID="c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c" Mar 18 10:17:30 crc kubenswrapper[4733]: E0318 10:17:30.302389 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c\": container with ID starting with c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c not found: ID does not exist" containerID="c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.302463 
4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c"} err="failed to get container status \"c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c\": rpc error: code = NotFound desc = could not find container \"c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c\": container with ID starting with c0f64ebe0af1fce843609f30c833e4b965000df30afbd1af8fae99160a42210c not found: ID does not exist" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.302498 4733 scope.go:117] "RemoveContainer" containerID="17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266" Mar 18 10:17:30 crc kubenswrapper[4733]: E0318 10:17:30.302878 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266\": container with ID starting with 17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266 not found: ID does not exist" containerID="17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.302947 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266"} err="failed to get container status \"17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266\": rpc error: code = NotFound desc = could not find container \"17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266\": container with ID starting with 17741288ba852c25d8355eb97aa338d2e36690e9d066bbb56a0857710c52f266 not found: ID does not exist" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.317831 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.317856 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.317868 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs5rx\" (UniqueName: \"kubernetes.io/projected/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc-kube-api-access-rs5rx\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.554155 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmw2d"] Mar 18 10:17:30 crc kubenswrapper[4733]: I0318 10:17:30.557766 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gmw2d"] Mar 18 10:17:31 crc kubenswrapper[4733]: I0318 10:17:31.184307 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" path="/var/lib/kubelet/pods/7eb97f2d-18fa-4e8c-895f-de4602c9dbbc/volumes" Mar 18 10:17:31 crc kubenswrapper[4733]: I0318 10:17:31.245379 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f82xf" event={"ID":"c91f12fa-96f0-442a-a3f7-70d56a697839","Type":"ContainerStarted","Data":"ceccb7b8039b2e6bd282db7ca8b7756c23dd7eceb685ae3d99b11f442338e94b"} Mar 18 10:17:31 crc kubenswrapper[4733]: I0318 10:17:31.816930 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f82xf" podStartSLOduration=3.17356909 podStartE2EDuration="55.816904468s" podCreationTimestamp="2026-03-18 10:16:36 +0000 UTC" firstStartedPulling="2026-03-18 10:16:37.971698041 +0000 UTC m=+237.463432366" lastFinishedPulling="2026-03-18 10:17:30.615033419 +0000 UTC m=+290.106767744" observedRunningTime="2026-03-18 
10:17:31.266377205 +0000 UTC m=+290.758111540" watchObservedRunningTime="2026-03-18 10:17:31.816904468 +0000 UTC m=+291.308638813" Mar 18 10:17:31 crc kubenswrapper[4733]: I0318 10:17:31.821796 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ll7dp"] Mar 18 10:17:31 crc kubenswrapper[4733]: I0318 10:17:31.822161 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ll7dp" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="registry-server" containerID="cri-o://7a59787487ae6ad4c8775bad7fe0e44e006e25d60fd069b4d5bd8cc6ceca6c70" gracePeriod=2 Mar 18 10:17:32 crc kubenswrapper[4733]: I0318 10:17:32.254430 4733 generic.go:334] "Generic (PLEG): container finished" podID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerID="7a59787487ae6ad4c8775bad7fe0e44e006e25d60fd069b4d5bd8cc6ceca6c70" exitCode=0 Mar 18 10:17:32 crc kubenswrapper[4733]: I0318 10:17:32.254517 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7dp" event={"ID":"82922e1e-56fb-432e-9441-b99bdb19fbb0","Type":"ContainerDied","Data":"7a59787487ae6ad4c8775bad7fe0e44e006e25d60fd069b4d5bd8cc6ceca6c70"} Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.023871 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.129639 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cf89766c-wb795"] Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.129944 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" podUID="f280b8dd-9b2f-4faa-9335-ca84961ea0a6" containerName="controller-manager" containerID="cri-o://c282a4dca9ca3d5260c23c43faf22ce611247dc38c8752227edd1fbf81bf90dd" gracePeriod=30 Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.148747 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn"] Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.149052 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" podUID="debb4f43-7f9a-4fdc-9896-db5106650a74" containerName="route-controller-manager" containerID="cri-o://7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740" gracePeriod=30 Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.166775 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txwcn\" (UniqueName: \"kubernetes.io/projected/82922e1e-56fb-432e-9441-b99bdb19fbb0-kube-api-access-txwcn\") pod \"82922e1e-56fb-432e-9441-b99bdb19fbb0\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.168152 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-utilities\") pod \"82922e1e-56fb-432e-9441-b99bdb19fbb0\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " Mar 18 10:17:33 crc 
kubenswrapper[4733]: I0318 10:17:33.168226 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-catalog-content\") pod \"82922e1e-56fb-432e-9441-b99bdb19fbb0\" (UID: \"82922e1e-56fb-432e-9441-b99bdb19fbb0\") " Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.169102 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-utilities" (OuterVolumeSpecName: "utilities") pod "82922e1e-56fb-432e-9441-b99bdb19fbb0" (UID: "82922e1e-56fb-432e-9441-b99bdb19fbb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.183363 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82922e1e-56fb-432e-9441-b99bdb19fbb0-kube-api-access-txwcn" (OuterVolumeSpecName: "kube-api-access-txwcn") pod "82922e1e-56fb-432e-9441-b99bdb19fbb0" (UID: "82922e1e-56fb-432e-9441-b99bdb19fbb0"). InnerVolumeSpecName "kube-api-access-txwcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.262801 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll7dp" event={"ID":"82922e1e-56fb-432e-9441-b99bdb19fbb0","Type":"ContainerDied","Data":"1a1211028e93b8b114b76fa499d9200418412506c6795f17a8a464f56e421c4c"} Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.262866 4733 scope.go:117] "RemoveContainer" containerID="7a59787487ae6ad4c8775bad7fe0e44e006e25d60fd069b4d5bd8cc6ceca6c70" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.262989 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll7dp" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.269664 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.269686 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txwcn\" (UniqueName: \"kubernetes.io/projected/82922e1e-56fb-432e-9441-b99bdb19fbb0-kube-api-access-txwcn\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.282181 4733 scope.go:117] "RemoveContainer" containerID="29a9561b8927709b5dd59a92cbf81b78eacad45dd6ac5ec49191d6faee246d53" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.298274 4733 scope.go:117] "RemoveContainer" containerID="9d9d502e889f0bc1ff5ac5bd25eb5937fb15878b89bb5f2186b3e420cda96e62" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.309202 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82922e1e-56fb-432e-9441-b99bdb19fbb0" (UID: "82922e1e-56fb-432e-9441-b99bdb19fbb0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.370767 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82922e1e-56fb-432e-9441-b99bdb19fbb0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.594064 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ll7dp"] Mar 18 10:17:33 crc kubenswrapper[4733]: I0318 10:17:33.605441 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ll7dp"] Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.143696 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.268910 4733 generic.go:334] "Generic (PLEG): container finished" podID="debb4f43-7f9a-4fdc-9896-db5106650a74" containerID="7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740" exitCode=0 Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.268999 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" event={"ID":"debb4f43-7f9a-4fdc-9896-db5106650a74","Type":"ContainerDied","Data":"7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740"} Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.269035 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" event={"ID":"debb4f43-7f9a-4fdc-9896-db5106650a74","Type":"ContainerDied","Data":"f30c8eb187bc3102906d7afcd8a11a3b91672d508cc51863174f4d3b52b10205"} Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.269056 4733 scope.go:117] "RemoveContainer" 
containerID="7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.269092 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.273353 4733 generic.go:334] "Generic (PLEG): container finished" podID="f280b8dd-9b2f-4faa-9335-ca84961ea0a6" containerID="c282a4dca9ca3d5260c23c43faf22ce611247dc38c8752227edd1fbf81bf90dd" exitCode=0 Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.273390 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" event={"ID":"f280b8dd-9b2f-4faa-9335-ca84961ea0a6","Type":"ContainerDied","Data":"c282a4dca9ca3d5260c23c43faf22ce611247dc38c8752227edd1fbf81bf90dd"} Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.282722 4733 scope.go:117] "RemoveContainer" containerID="7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740" Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.283184 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740\": container with ID starting with 7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740 not found: ID does not exist" containerID="7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.283290 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740"} err="failed to get container status \"7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740\": rpc error: code = NotFound desc = could not find container 
\"7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740\": container with ID starting with 7412c3c1effa7cc3545719a633ad64080beb347c499c8ddc9951beee83c2e740 not found: ID does not exist" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.284716 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwfk4\" (UniqueName: \"kubernetes.io/projected/debb4f43-7f9a-4fdc-9896-db5106650a74-kube-api-access-wwfk4\") pod \"debb4f43-7f9a-4fdc-9896-db5106650a74\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.284803 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-client-ca\") pod \"debb4f43-7f9a-4fdc-9896-db5106650a74\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.284855 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-config\") pod \"debb4f43-7f9a-4fdc-9896-db5106650a74\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.284944 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/debb4f43-7f9a-4fdc-9896-db5106650a74-serving-cert\") pod \"debb4f43-7f9a-4fdc-9896-db5106650a74\" (UID: \"debb4f43-7f9a-4fdc-9896-db5106650a74\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.285938 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-config" (OuterVolumeSpecName: "config") pod "debb4f43-7f9a-4fdc-9896-db5106650a74" (UID: "debb4f43-7f9a-4fdc-9896-db5106650a74"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.286569 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-client-ca" (OuterVolumeSpecName: "client-ca") pod "debb4f43-7f9a-4fdc-9896-db5106650a74" (UID: "debb4f43-7f9a-4fdc-9896-db5106650a74"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.290148 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debb4f43-7f9a-4fdc-9896-db5106650a74-kube-api-access-wwfk4" (OuterVolumeSpecName: "kube-api-access-wwfk4") pod "debb4f43-7f9a-4fdc-9896-db5106650a74" (UID: "debb4f43-7f9a-4fdc-9896-db5106650a74"). InnerVolumeSpecName "kube-api-access-wwfk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.290637 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debb4f43-7f9a-4fdc-9896-db5106650a74-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "debb4f43-7f9a-4fdc-9896-db5106650a74" (UID: "debb4f43-7f9a-4fdc-9896-db5106650a74"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.324818 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.386807 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-config\") pod \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.386865 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-575kc\" (UniqueName: \"kubernetes.io/projected/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-kube-api-access-575kc\") pod \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.386919 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-client-ca\") pod \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.387016 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-proxy-ca-bundles\") pod \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.387045 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-serving-cert\") pod \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\" (UID: \"f280b8dd-9b2f-4faa-9335-ca84961ea0a6\") " Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.387304 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwfk4\" 
(UniqueName: \"kubernetes.io/projected/debb4f43-7f9a-4fdc-9896-db5106650a74-kube-api-access-wwfk4\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.387319 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.387329 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debb4f43-7f9a-4fdc-9896-db5106650a74-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.387337 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/debb4f43-7f9a-4fdc-9896-db5106650a74-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.388113 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "f280b8dd-9b2f-4faa-9335-ca84961ea0a6" (UID: "f280b8dd-9b2f-4faa-9335-ca84961ea0a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.388231 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f280b8dd-9b2f-4faa-9335-ca84961ea0a6" (UID: "f280b8dd-9b2f-4faa-9335-ca84961ea0a6"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.388842 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-config" (OuterVolumeSpecName: "config") pod "f280b8dd-9b2f-4faa-9335-ca84961ea0a6" (UID: "f280b8dd-9b2f-4faa-9335-ca84961ea0a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.390419 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f280b8dd-9b2f-4faa-9335-ca84961ea0a6" (UID: "f280b8dd-9b2f-4faa-9335-ca84961ea0a6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.390476 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-kube-api-access-575kc" (OuterVolumeSpecName: "kube-api-access-575kc") pod "f280b8dd-9b2f-4faa-9335-ca84961ea0a6" (UID: "f280b8dd-9b2f-4faa-9335-ca84961ea0a6"). InnerVolumeSpecName "kube-api-access-575kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.429122 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.429394 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.451067 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"] Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.452085 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f280b8dd-9b2f-4faa-9335-ca84961ea0a6" containerName="controller-manager" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452097 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f280b8dd-9b2f-4faa-9335-ca84961ea0a6" containerName="controller-manager" Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.452108 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerName="extract-utilities" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452114 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerName="extract-utilities" Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.452121 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="extract-utilities" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452127 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="extract-utilities" Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.452137 4733 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerName="registry-server" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452142 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerName="registry-server" Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.452149 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerName="extract-content" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452155 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerName="extract-content" Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.452164 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="extract-content" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452169 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="extract-content" Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.452176 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="registry-server" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452181 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="registry-server" Mar 18 10:17:34 crc kubenswrapper[4733]: E0318 10:17:34.452201 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debb4f43-7f9a-4fdc-9896-db5106650a74" containerName="route-controller-manager" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452206 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="debb4f43-7f9a-4fdc-9896-db5106650a74" containerName="route-controller-manager" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452303 4733 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="7eb97f2d-18fa-4e8c-895f-de4602c9dbbc" containerName="registry-server" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452314 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" containerName="registry-server" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452322 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="debb4f43-7f9a-4fdc-9896-db5106650a74" containerName="route-controller-manager" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452331 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f280b8dd-9b2f-4faa-9335-ca84961ea0a6" containerName="controller-manager" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.452660 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.467257 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"] Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.476092 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.488432 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.488464 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-575kc\" (UniqueName: \"kubernetes.io/projected/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-kube-api-access-575kc\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.488474 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.488483 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.488491 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f280b8dd-9b2f-4faa-9335-ca84961ea0a6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.589378 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgfsn\" (UniqueName: \"kubernetes.io/projected/2338c705-9627-4c7c-97c5-60c492309e8f-kube-api-access-sgfsn\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.589533 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2338c705-9627-4c7c-97c5-60c492309e8f-serving-cert\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.589555 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-config\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " 
pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.589603 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-client-ca\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.593344 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn"] Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.596840 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966b4866d-xm6hn"] Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.690605 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2338c705-9627-4c7c-97c5-60c492309e8f-serving-cert\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.690654 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-config\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.690695 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-client-ca\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.690762 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgfsn\" (UniqueName: \"kubernetes.io/projected/2338c705-9627-4c7c-97c5-60c492309e8f-kube-api-access-sgfsn\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.691760 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-client-ca\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.692075 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-config\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.698858 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2338c705-9627-4c7c-97c5-60c492309e8f-serving-cert\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc 
kubenswrapper[4733]: I0318 10:17:34.705857 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgfsn\" (UniqueName: \"kubernetes.io/projected/2338c705-9627-4c7c-97c5-60c492309e8f-kube-api-access-sgfsn\") pod \"route-controller-manager-795c5666f8-fqxxn\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.723499 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.723529 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.763519 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w7rrs" Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.767764 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"
Mar 18 10:17:34 crc kubenswrapper[4733]: I0318 10:17:34.962917 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"]
Mar 18 10:17:34 crc kubenswrapper[4733]: W0318 10:17:34.971061 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2338c705_9627_4c7c_97c5_60c492309e8f.slice/crio-d2316c222486d15d81f6544c130d255f209d75ef4a225b569454d5bafde3bc6e WatchSource:0}: Error finding container d2316c222486d15d81f6544c130d255f209d75ef4a225b569454d5bafde3bc6e: Status 404 returned error can't find the container with id d2316c222486d15d81f6544c130d255f209d75ef4a225b569454d5bafde3bc6e
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.183645 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82922e1e-56fb-432e-9441-b99bdb19fbb0" path="/var/lib/kubelet/pods/82922e1e-56fb-432e-9441-b99bdb19fbb0/volumes"
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.185356 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debb4f43-7f9a-4fdc-9896-db5106650a74" path="/var/lib/kubelet/pods/debb4f43-7f9a-4fdc-9896-db5106650a74/volumes"
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.281046 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795" event={"ID":"f280b8dd-9b2f-4faa-9335-ca84961ea0a6","Type":"ContainerDied","Data":"15b644538c3645c79d06f703952a07b7b4ae39d43d1dc32a31009b361d9095cd"}
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.281061 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cf89766c-wb795"
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.281115 4733 scope.go:117] "RemoveContainer" containerID="c282a4dca9ca3d5260c23c43faf22ce611247dc38c8752227edd1fbf81bf90dd"
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.282507 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" event={"ID":"2338c705-9627-4c7c-97c5-60c492309e8f","Type":"ContainerStarted","Data":"d2316c222486d15d81f6544c130d255f209d75ef4a225b569454d5bafde3bc6e"}
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.306258 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cf89766c-wb795"]
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.308711 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cf89766c-wb795"]
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.320807 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w7rrs"
Mar 18 10:17:35 crc kubenswrapper[4733]: I0318 10:17:35.331402 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rls2r"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.243384 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.243517 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.289035 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.290821 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" event={"ID":"2338c705-9627-4c7c-97c5-60c492309e8f","Type":"ContainerStarted","Data":"b5cbc2cf0ca3b801430e89c3bd7280dd3b0874c8d8c15426c4e2737151b8f3ad"}
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.291079 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.296788 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.340559 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jb86w"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.433604 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" podStartSLOduration=3.433581354 podStartE2EDuration="3.433581354s" podCreationTimestamp="2026-03-18 10:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:17:36.349178901 +0000 UTC m=+295.840913216" watchObservedRunningTime="2026-03-18 10:17:36.433581354 +0000 UTC m=+295.925315679"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.454847 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"]
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.456275 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.462795 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.464229 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.464443 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.464541 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.464227 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.464813 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.471349 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.480831 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"]
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.521650 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c910a75-c5cb-4f2e-ba5b-29866e412aae-serving-cert\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.521711 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-config\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.521737 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bffk7\" (UniqueName: \"kubernetes.io/projected/9c910a75-c5cb-4f2e-ba5b-29866e412aae-kube-api-access-bffk7\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.521762 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-client-ca\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.521956 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-proxy-ca-bundles\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.623603 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c910a75-c5cb-4f2e-ba5b-29866e412aae-serving-cert\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.623664 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-config\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.623686 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bffk7\" (UniqueName: \"kubernetes.io/projected/9c910a75-c5cb-4f2e-ba5b-29866e412aae-kube-api-access-bffk7\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.623713 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-client-ca\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.623759 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-proxy-ca-bundles\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.624952 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-proxy-ca-bundles\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.625068 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-client-ca\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.625164 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-config\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.631613 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c910a75-c5cb-4f2e-ba5b-29866e412aae-serving-cert\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.633089 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.633128 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.648935 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bffk7\" (UniqueName: \"kubernetes.io/projected/9c910a75-c5cb-4f2e-ba5b-29866e412aae-kube-api-access-bffk7\") pod \"controller-manager-855cb9cb7d-nd8zj\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.688140 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.774601 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:36 crc kubenswrapper[4733]: I0318 10:17:36.981970 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"]
Mar 18 10:17:37 crc kubenswrapper[4733]: I0318 10:17:37.183283 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f280b8dd-9b2f-4faa-9335-ca84961ea0a6" path="/var/lib/kubelet/pods/f280b8dd-9b2f-4faa-9335-ca84961ea0a6/volumes"
Mar 18 10:17:37 crc kubenswrapper[4733]: I0318 10:17:37.300277 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj" event={"ID":"9c910a75-c5cb-4f2e-ba5b-29866e412aae","Type":"ContainerStarted","Data":"b2763b1f739435470aef969e3ffa833e68884b68a2e5a4fe94dd1c73a705f26a"}
Mar 18 10:17:37 crc kubenswrapper[4733]: I0318 10:17:37.300341 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj" event={"ID":"9c910a75-c5cb-4f2e-ba5b-29866e412aae","Type":"ContainerStarted","Data":"429b52d24bd3668b5415677a4bed9fb2a028c9049e40bf9e468c59dfe2fee20b"}
Mar 18 10:17:37 crc kubenswrapper[4733]: I0318 10:17:37.347835 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:17:37 crc kubenswrapper[4733]: I0318 10:17:37.381277 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj" podStartSLOduration=4.381237756 podStartE2EDuration="4.381237756s" podCreationTimestamp="2026-03-18 10:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:17:37.328972333 +0000 UTC m=+296.820706658" watchObservedRunningTime="2026-03-18 10:17:37.381237756 +0000 UTC m=+296.872972071"
Mar 18 10:17:37 crc kubenswrapper[4733]: I0318 10:17:37.619370 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w7rrs"]
Mar 18 10:17:37 crc kubenswrapper[4733]: I0318 10:17:37.619626 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w7rrs" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" containerName="registry-server" containerID="cri-o://0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6" gracePeriod=2
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.035973 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w7rrs"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.146043 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-utilities\") pod \"02cd6358-355c-4db8-b0f7-2528618602ff\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") "
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.146167 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vpf\" (UniqueName: \"kubernetes.io/projected/02cd6358-355c-4db8-b0f7-2528618602ff-kube-api-access-s5vpf\") pod \"02cd6358-355c-4db8-b0f7-2528618602ff\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") "
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.146288 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-catalog-content\") pod \"02cd6358-355c-4db8-b0f7-2528618602ff\" (UID: \"02cd6358-355c-4db8-b0f7-2528618602ff\") "
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.147465 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-utilities" (OuterVolumeSpecName: "utilities") pod "02cd6358-355c-4db8-b0f7-2528618602ff" (UID: "02cd6358-355c-4db8-b0f7-2528618602ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.156182 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02cd6358-355c-4db8-b0f7-2528618602ff-kube-api-access-s5vpf" (OuterVolumeSpecName: "kube-api-access-s5vpf") pod "02cd6358-355c-4db8-b0f7-2528618602ff" (UID: "02cd6358-355c-4db8-b0f7-2528618602ff"). InnerVolumeSpecName "kube-api-access-s5vpf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.193372 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02cd6358-355c-4db8-b0f7-2528618602ff" (UID: "02cd6358-355c-4db8-b0f7-2528618602ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.248485 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.248539 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cd6358-355c-4db8-b0f7-2528618602ff-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.248565 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5vpf\" (UniqueName: \"kubernetes.io/projected/02cd6358-355c-4db8-b0f7-2528618602ff-kube-api-access-s5vpf\") on node \"crc\" DevicePath \"\""
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.306713 4733 generic.go:334] "Generic (PLEG): container finished" podID="02cd6358-355c-4db8-b0f7-2528618602ff" containerID="0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6" exitCode=0
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.307033 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w7rrs"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.307078 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7rrs" event={"ID":"02cd6358-355c-4db8-b0f7-2528618602ff","Type":"ContainerDied","Data":"0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6"}
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.307162 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7rrs" event={"ID":"02cd6358-355c-4db8-b0f7-2528618602ff","Type":"ContainerDied","Data":"bf716e26a7a1e4408c9cf17e7366833bdc30d38efd823adf2eb5d92d8a80e381"}
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.307201 4733 scope.go:117] "RemoveContainer" containerID="0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.308015 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.314605 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.337368 4733 scope.go:117] "RemoveContainer" containerID="6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.360856 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w7rrs"]
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.360916 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w7rrs"]
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.374425 4733 scope.go:117] "RemoveContainer" containerID="0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.400342 4733 scope.go:117] "RemoveContainer" containerID="0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6"
Mar 18 10:17:38 crc kubenswrapper[4733]: E0318 10:17:38.400771 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6\": container with ID starting with 0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6 not found: ID does not exist" containerID="0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.400809 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6"} err="failed to get container status \"0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6\": rpc error: code = NotFound desc = could not find container \"0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6\": container with ID starting with 0670cad9cd6cfa6a4cf42522d884617aa8495c18fcee7ac17083576de64388b6 not found: ID does not exist"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.400836 4733 scope.go:117] "RemoveContainer" containerID="6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450"
Mar 18 10:17:38 crc kubenswrapper[4733]: E0318 10:17:38.401555 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450\": container with ID starting with 6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450 not found: ID does not exist" containerID="6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.401576 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450"} err="failed to get container status \"6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450\": rpc error: code = NotFound desc = could not find container \"6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450\": container with ID starting with 6888d6a2cdf0ef31a9d456bbbbc7efb04bffa6fcf33a7a14044f9c00de4a1450 not found: ID does not exist"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.401591 4733 scope.go:117] "RemoveContainer" containerID="0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed"
Mar 18 10:17:38 crc kubenswrapper[4733]: E0318 10:17:38.402061 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed\": container with ID starting with 0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed not found: ID does not exist" containerID="0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed"
Mar 18 10:17:38 crc kubenswrapper[4733]: I0318 10:17:38.402119 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed"} err="failed to get container status \"0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed\": rpc error: code = NotFound desc = could not find container \"0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed\": container with ID starting with 0f769f3a01023165d2b55b37631a8e64c99ba9561927f3f83986829531dcb6ed not found: ID does not exist"
Mar 18 10:17:39 crc kubenswrapper[4733]: I0318 10:17:39.182326 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" path="/var/lib/kubelet/pods/02cd6358-355c-4db8-b0f7-2528618602ff/volumes"
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.025020 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f82xf"]
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.025464 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f82xf" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerName="registry-server" containerID="cri-o://ceccb7b8039b2e6bd282db7ca8b7756c23dd7eceb685ae3d99b11f442338e94b" gracePeriod=2
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.324009 4733 generic.go:334] "Generic (PLEG): container finished" podID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerID="ceccb7b8039b2e6bd282db7ca8b7756c23dd7eceb685ae3d99b11f442338e94b" exitCode=0
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.324084 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f82xf" event={"ID":"c91f12fa-96f0-442a-a3f7-70d56a697839","Type":"ContainerDied","Data":"ceccb7b8039b2e6bd282db7ca8b7756c23dd7eceb685ae3d99b11f442338e94b"}
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.541018 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.733450 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-catalog-content\") pod \"c91f12fa-96f0-442a-a3f7-70d56a697839\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") "
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.733529 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-utilities\") pod \"c91f12fa-96f0-442a-a3f7-70d56a697839\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") "
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.733610 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r2vl\" (UniqueName: \"kubernetes.io/projected/c91f12fa-96f0-442a-a3f7-70d56a697839-kube-api-access-6r2vl\") pod \"c91f12fa-96f0-442a-a3f7-70d56a697839\" (UID: \"c91f12fa-96f0-442a-a3f7-70d56a697839\") "
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.734755 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-utilities" (OuterVolumeSpecName: "utilities") pod "c91f12fa-96f0-442a-a3f7-70d56a697839" (UID: "c91f12fa-96f0-442a-a3f7-70d56a697839"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.741046 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91f12fa-96f0-442a-a3f7-70d56a697839-kube-api-access-6r2vl" (OuterVolumeSpecName: "kube-api-access-6r2vl") pod "c91f12fa-96f0-442a-a3f7-70d56a697839" (UID: "c91f12fa-96f0-442a-a3f7-70d56a697839"). InnerVolumeSpecName "kube-api-access-6r2vl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.831627 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c91f12fa-96f0-442a-a3f7-70d56a697839" (UID: "c91f12fa-96f0-442a-a3f7-70d56a697839"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.834832 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r2vl\" (UniqueName: \"kubernetes.io/projected/c91f12fa-96f0-442a-a3f7-70d56a697839-kube-api-access-6r2vl\") on node \"crc\" DevicePath \"\""
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.834867 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 10:17:40 crc kubenswrapper[4733]: I0318 10:17:40.834879 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91f12fa-96f0-442a-a3f7-70d56a697839-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 10:17:41 crc kubenswrapper[4733]: I0318 10:17:41.334095 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f82xf" event={"ID":"c91f12fa-96f0-442a-a3f7-70d56a697839","Type":"ContainerDied","Data":"a8fa061a3aa824aa80f6c1569abe326d18dccd731789c62f81d22de7e9a828d3"}
Mar 18 10:17:41 crc kubenswrapper[4733]: I0318 10:17:41.334291 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f82xf"
Mar 18 10:17:41 crc kubenswrapper[4733]: I0318 10:17:41.334583 4733 scope.go:117] "RemoveContainer" containerID="ceccb7b8039b2e6bd282db7ca8b7756c23dd7eceb685ae3d99b11f442338e94b"
Mar 18 10:17:41 crc kubenswrapper[4733]: I0318 10:17:41.365381 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f82xf"]
Mar 18 10:17:41 crc kubenswrapper[4733]: I0318 10:17:41.371874 4733 scope.go:117] "RemoveContainer" containerID="e4094d8b4eb850df07d3a19e616d2e9c130ee64b58c626295f963df49e875ea5"
Mar 18 10:17:41 crc kubenswrapper[4733]: I0318 10:17:41.372697 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f82xf"]
Mar 18 10:17:41 crc kubenswrapper[4733]: I0318 10:17:41.398234 4733 scope.go:117] "RemoveContainer" containerID="6da3522bbcdb557467c36bac266a9dafb390a5a917de44dd30de9c3ac03051e1"
Mar 18 10:17:43 crc kubenswrapper[4733]: I0318 10:17:43.186429 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" path="/var/lib/kubelet/pods/c91f12fa-96f0-442a-a3f7-70d56a697839/volumes"
Mar 18 10:17:43 crc kubenswrapper[4733]: I0318 10:17:43.571123 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:17:43 crc kubenswrapper[4733]: I0318 10:17:43.572089 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:17:43 crc kubenswrapper[4733]: I0318 10:17:43.572164 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp"
Mar 18 10:17:43 crc kubenswrapper[4733]: I0318 10:17:43.572923 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 10:17:43 crc kubenswrapper[4733]: I0318 10:17:43.572987 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830" gracePeriod=600
Mar 18 10:17:44 crc kubenswrapper[4733]: I0318 10:17:44.361637 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830" exitCode=0
Mar 18 10:17:44 crc kubenswrapper[4733]: I0318 10:17:44.362043 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830"}
Mar 18 10:17:44 crc kubenswrapper[4733]: I0318 10:17:44.362090 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"2dcc5035fa17fe3e92cf26ce37e02cacce4ad31a0593e6e1184b98062f31f028"}
Mar 18 10:17:46 crc kubenswrapper[4733]: I0318 10:17:46.385848 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6hmz"]
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.109332 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"]
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.110199 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj" podUID="9c910a75-c5cb-4f2e-ba5b-29866e412aae" containerName="controller-manager" containerID="cri-o://b2763b1f739435470aef969e3ffa833e68884b68a2e5a4fe94dd1c73a705f26a" gracePeriod=30
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.201721 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"]
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.201983 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" podUID="2338c705-9627-4c7c-97c5-60c492309e8f" containerName="route-controller-manager" containerID="cri-o://b5cbc2cf0ca3b801430e89c3bd7280dd3b0874c8d8c15426c4e2737151b8f3ad" gracePeriod=30
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.419406 4733 generic.go:334] "Generic (PLEG): container finished" podID="9c910a75-c5cb-4f2e-ba5b-29866e412aae" containerID="b2763b1f739435470aef969e3ffa833e68884b68a2e5a4fe94dd1c73a705f26a" exitCode=0
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.419474 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj" event={"ID":"9c910a75-c5cb-4f2e-ba5b-29866e412aae","Type":"ContainerDied","Data":"b2763b1f739435470aef969e3ffa833e68884b68a2e5a4fe94dd1c73a705f26a"}
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.421715 4733 generic.go:334] "Generic (PLEG): container finished" podID="2338c705-9627-4c7c-97c5-60c492309e8f" containerID="b5cbc2cf0ca3b801430e89c3bd7280dd3b0874c8d8c15426c4e2737151b8f3ad" exitCode=0
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.421761 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" event={"ID":"2338c705-9627-4c7c-97c5-60c492309e8f","Type":"ContainerDied","Data":"b5cbc2cf0ca3b801430e89c3bd7280dd3b0874c8d8c15426c4e2737151b8f3ad"}
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.687568 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.695440 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.826099 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c910a75-c5cb-4f2e-ba5b-29866e412aae-serving-cert\") pod \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") "
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.826165 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-config\") pod \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") "
Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.826227 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-config\") pod \"2338c705-9627-4c7c-97c5-60c492309e8f\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") "
Mar 18 10:17:53 crc
kubenswrapper[4733]: I0318 10:17:53.826287 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-client-ca\") pod \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.826311 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2338c705-9627-4c7c-97c5-60c492309e8f-serving-cert\") pod \"2338c705-9627-4c7c-97c5-60c492309e8f\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.826339 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bffk7\" (UniqueName: \"kubernetes.io/projected/9c910a75-c5cb-4f2e-ba5b-29866e412aae-kube-api-access-bffk7\") pod \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.826382 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-proxy-ca-bundles\") pod \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\" (UID: \"9c910a75-c5cb-4f2e-ba5b-29866e412aae\") " Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.826453 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-client-ca\") pod \"2338c705-9627-4c7c-97c5-60c492309e8f\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.826485 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgfsn\" (UniqueName: 
\"kubernetes.io/projected/2338c705-9627-4c7c-97c5-60c492309e8f-kube-api-access-sgfsn\") pod \"2338c705-9627-4c7c-97c5-60c492309e8f\" (UID: \"2338c705-9627-4c7c-97c5-60c492309e8f\") " Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.827353 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9c910a75-c5cb-4f2e-ba5b-29866e412aae" (UID: "9c910a75-c5cb-4f2e-ba5b-29866e412aae"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.827374 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c910a75-c5cb-4f2e-ba5b-29866e412aae" (UID: "9c910a75-c5cb-4f2e-ba5b-29866e412aae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.827353 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-client-ca" (OuterVolumeSpecName: "client-ca") pod "2338c705-9627-4c7c-97c5-60c492309e8f" (UID: "2338c705-9627-4c7c-97c5-60c492309e8f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.827598 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-config" (OuterVolumeSpecName: "config") pod "2338c705-9627-4c7c-97c5-60c492309e8f" (UID: "2338c705-9627-4c7c-97c5-60c492309e8f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.827802 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-config" (OuterVolumeSpecName: "config") pod "9c910a75-c5cb-4f2e-ba5b-29866e412aae" (UID: "9c910a75-c5cb-4f2e-ba5b-29866e412aae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.832273 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c910a75-c5cb-4f2e-ba5b-29866e412aae-kube-api-access-bffk7" (OuterVolumeSpecName: "kube-api-access-bffk7") pod "9c910a75-c5cb-4f2e-ba5b-29866e412aae" (UID: "9c910a75-c5cb-4f2e-ba5b-29866e412aae"). InnerVolumeSpecName "kube-api-access-bffk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.832453 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c910a75-c5cb-4f2e-ba5b-29866e412aae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c910a75-c5cb-4f2e-ba5b-29866e412aae" (UID: "9c910a75-c5cb-4f2e-ba5b-29866e412aae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.832494 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2338c705-9627-4c7c-97c5-60c492309e8f-kube-api-access-sgfsn" (OuterVolumeSpecName: "kube-api-access-sgfsn") pod "2338c705-9627-4c7c-97c5-60c492309e8f" (UID: "2338c705-9627-4c7c-97c5-60c492309e8f"). InnerVolumeSpecName "kube-api-access-sgfsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.832616 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2338c705-9627-4c7c-97c5-60c492309e8f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2338c705-9627-4c7c-97c5-60c492309e8f" (UID: "2338c705-9627-4c7c-97c5-60c492309e8f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.927882 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c910a75-c5cb-4f2e-ba5b-29866e412aae-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.928090 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.928213 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.928287 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.928348 4733 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2338c705-9627-4c7c-97c5-60c492309e8f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.928409 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bffk7\" (UniqueName: 
\"kubernetes.io/projected/9c910a75-c5cb-4f2e-ba5b-29866e412aae-kube-api-access-bffk7\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.928632 4733 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9c910a75-c5cb-4f2e-ba5b-29866e412aae-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.928674 4733 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2338c705-9627-4c7c-97c5-60c492309e8f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:53 crc kubenswrapper[4733]: I0318 10:17:53.928693 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgfsn\" (UniqueName: \"kubernetes.io/projected/2338c705-9627-4c7c-97c5-60c492309e8f-kube-api-access-sgfsn\") on node \"crc\" DevicePath \"\"" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.431540 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.431562 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn" event={"ID":"2338c705-9627-4c7c-97c5-60c492309e8f","Type":"ContainerDied","Data":"d2316c222486d15d81f6544c130d255f209d75ef4a225b569454d5bafde3bc6e"} Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.431670 4733 scope.go:117] "RemoveContainer" containerID="b5cbc2cf0ca3b801430e89c3bd7280dd3b0874c8d8c15426c4e2737151b8f3ad" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.438009 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj" event={"ID":"9c910a75-c5cb-4f2e-ba5b-29866e412aae","Type":"ContainerDied","Data":"429b52d24bd3668b5415677a4bed9fb2a028c9049e40bf9e468c59dfe2fee20b"} Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.438167 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.451401 4733 scope.go:117] "RemoveContainer" containerID="b2763b1f739435470aef969e3ffa833e68884b68a2e5a4fe94dd1c73a705f26a" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.467039 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56547f49c8-qcl2v"] Mar 18 10:17:54 crc kubenswrapper[4733]: E0318 10:17:54.467992 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" containerName="extract-utilities" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.468254 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" containerName="extract-utilities" Mar 18 10:17:54 crc kubenswrapper[4733]: E0318 10:17:54.468440 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerName="registry-server" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.468678 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerName="registry-server" Mar 18 10:17:54 crc kubenswrapper[4733]: E0318 10:17:54.468873 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" containerName="registry-server" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.469050 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" containerName="registry-server" Mar 18 10:17:54 crc kubenswrapper[4733]: E0318 10:17:54.469276 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerName="extract-utilities" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.469458 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerName="extract-utilities" Mar 18 10:17:54 crc kubenswrapper[4733]: E0318 10:17:54.469666 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" containerName="extract-content" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.472552 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" containerName="extract-content" Mar 18 10:17:54 crc kubenswrapper[4733]: E0318 10:17:54.472849 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2338c705-9627-4c7c-97c5-60c492309e8f" containerName="route-controller-manager" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.473019 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2338c705-9627-4c7c-97c5-60c492309e8f" containerName="route-controller-manager" Mar 18 10:17:54 crc kubenswrapper[4733]: E0318 10:17:54.473164 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c910a75-c5cb-4f2e-ba5b-29866e412aae" containerName="controller-manager" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.473743 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c910a75-c5cb-4f2e-ba5b-29866e412aae" containerName="controller-manager" Mar 18 10:17:54 crc kubenswrapper[4733]: E0318 10:17:54.473923 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerName="extract-content" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.474096 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerName="extract-content" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.474786 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c910a75-c5cb-4f2e-ba5b-29866e412aae" containerName="controller-manager" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.474837 4733 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="02cd6358-355c-4db8-b0f7-2528618602ff" containerName="registry-server" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.474850 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91f12fa-96f0-442a-a3f7-70d56a697839" containerName="registry-server" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.474871 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2338c705-9627-4c7c-97c5-60c492309e8f" containerName="route-controller-manager" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.476442 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5"] Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.477237 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.478038 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.481755 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.482763 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"] Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.485832 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.487289 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.487622 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.488325 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.488556 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.488678 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.488744 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.488900 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 
18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.488950 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.488550 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.489257 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.490294 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-795c5666f8-fqxxn"] Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.494603 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.495650 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56547f49c8-qcl2v"] Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.498222 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5"] Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.536604 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-config\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.536664 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-proxy-ca-bundles\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.536806 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8sx\" (UniqueName: \"kubernetes.io/projected/f128aaea-3d79-459f-9a33-3a2505089c1c-kube-api-access-cv8sx\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.536881 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f128aaea-3d79-459f-9a33-3a2505089c1c-client-ca\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.536965 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f128aaea-3d79-459f-9a33-3a2505089c1c-config\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.537087 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjfg9\" (UniqueName: \"kubernetes.io/projected/6ff40242-1a33-443f-892d-145e88f98be2-kube-api-access-qjfg9\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " 
pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.537115 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-client-ca\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.537138 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ff40242-1a33-443f-892d-145e88f98be2-serving-cert\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.537169 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f128aaea-3d79-459f-9a33-3a2505089c1c-serving-cert\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.539783 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"] Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.548364 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-855cb9cb7d-nd8zj"] Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.638448 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-client-ca\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.638548 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ff40242-1a33-443f-892d-145e88f98be2-serving-cert\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.638735 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f128aaea-3d79-459f-9a33-3a2505089c1c-serving-cert\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.638816 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-config\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.638869 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-proxy-ca-bundles\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.638911 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8sx\" (UniqueName: \"kubernetes.io/projected/f128aaea-3d79-459f-9a33-3a2505089c1c-kube-api-access-cv8sx\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.638993 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f128aaea-3d79-459f-9a33-3a2505089c1c-client-ca\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.639096 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f128aaea-3d79-459f-9a33-3a2505089c1c-config\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.639322 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjfg9\" (UniqueName: \"kubernetes.io/projected/6ff40242-1a33-443f-892d-145e88f98be2-kube-api-access-qjfg9\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.639945 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-client-ca\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: 
\"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.641236 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f128aaea-3d79-459f-9a33-3a2505089c1c-client-ca\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.641340 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-proxy-ca-bundles\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.642137 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff40242-1a33-443f-892d-145e88f98be2-config\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.642466 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f128aaea-3d79-459f-9a33-3a2505089c1c-config\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.644580 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f128aaea-3d79-459f-9a33-3a2505089c1c-serving-cert\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.648738 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ff40242-1a33-443f-892d-145e88f98be2-serving-cert\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.659736 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjfg9\" (UniqueName: \"kubernetes.io/projected/6ff40242-1a33-443f-892d-145e88f98be2-kube-api-access-qjfg9\") pod \"controller-manager-56547f49c8-qcl2v\" (UID: \"6ff40242-1a33-443f-892d-145e88f98be2\") " pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.661960 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8sx\" (UniqueName: \"kubernetes.io/projected/f128aaea-3d79-459f-9a33-3a2505089c1c-kube-api-access-cv8sx\") pod \"route-controller-manager-5975b786b-2xcz5\" (UID: \"f128aaea-3d79-459f-9a33-3a2505089c1c\") " pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.845941 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:54 crc kubenswrapper[4733]: I0318 10:17:54.856171 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.090410 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56547f49c8-qcl2v"] Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.184518 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2338c705-9627-4c7c-97c5-60c492309e8f" path="/var/lib/kubelet/pods/2338c705-9627-4c7c-97c5-60c492309e8f/volumes" Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.185400 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c910a75-c5cb-4f2e-ba5b-29866e412aae" path="/var/lib/kubelet/pods/9c910a75-c5cb-4f2e-ba5b-29866e412aae/volumes" Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.254357 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5"] Mar 18 10:17:55 crc kubenswrapper[4733]: W0318 10:17:55.263615 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf128aaea_3d79_459f_9a33_3a2505089c1c.slice/crio-cdf7013e2f2b84291c4f32f283296d753d937c2f51fa8a1530466e4ba3550edd WatchSource:0}: Error finding container cdf7013e2f2b84291c4f32f283296d753d937c2f51fa8a1530466e4ba3550edd: Status 404 returned error can't find the container with id cdf7013e2f2b84291c4f32f283296d753d937c2f51fa8a1530466e4ba3550edd Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.446112 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" event={"ID":"f128aaea-3d79-459f-9a33-3a2505089c1c","Type":"ContainerStarted","Data":"cdf7013e2f2b84291c4f32f283296d753d937c2f51fa8a1530466e4ba3550edd"} Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.449406 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" event={"ID":"6ff40242-1a33-443f-892d-145e88f98be2","Type":"ContainerStarted","Data":"720c9ebb3f4e168d5a9de625b15345ed0376c77651ce2ea95b52e8d23eeeb0ca"} Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.449458 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" event={"ID":"6ff40242-1a33-443f-892d-145e88f98be2","Type":"ContainerStarted","Data":"df50f7e523e0addea33cd24a662f83fd05891abc2d4d7482d097e979154a9807"} Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.450571 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.454223 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" Mar 18 10:17:55 crc kubenswrapper[4733]: I0318 10:17:55.466819 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56547f49c8-qcl2v" podStartSLOduration=2.466805775 podStartE2EDuration="2.466805775s" podCreationTimestamp="2026-03-18 10:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:17:55.465957992 +0000 UTC m=+314.957692317" watchObservedRunningTime="2026-03-18 10:17:55.466805775 +0000 UTC m=+314.958540090" Mar 18 10:17:56 crc kubenswrapper[4733]: I0318 10:17:56.460174 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" event={"ID":"f128aaea-3d79-459f-9a33-3a2505089c1c","Type":"ContainerStarted","Data":"fc309deba97733a5605b5f9ba276b7e31a6f5046e89882a706264a8983d5cffe"} Mar 18 10:17:56 crc kubenswrapper[4733]: I0318 10:17:56.477733 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" podStartSLOduration=3.477714958 podStartE2EDuration="3.477714958s" podCreationTimestamp="2026-03-18 10:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:17:56.474731634 +0000 UTC m=+315.966465969" watchObservedRunningTime="2026-03-18 10:17:56.477714958 +0000 UTC m=+315.969449283" Mar 18 10:17:57 crc kubenswrapper[4733]: I0318 10:17:57.464685 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:57 crc kubenswrapper[4733]: I0318 10:17:57.469978 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5975b786b-2xcz5" Mar 18 10:17:59 crc kubenswrapper[4733]: I0318 10:17:59.975306 4733 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 10:17:59 crc kubenswrapper[4733]: I0318 10:17:59.982343 4733 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 10:17:59 crc kubenswrapper[4733]: I0318 10:17:59.982991 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:17:59 crc kubenswrapper[4733]: I0318 10:17:59.983032 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba" gracePeriod=15 Mar 18 10:17:59 crc kubenswrapper[4733]: I0318 10:17:59.983667 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052" gracePeriod=15 Mar 18 10:17:59 crc kubenswrapper[4733]: I0318 10:17:59.983878 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6" gracePeriod=15 Mar 18 10:17:59 crc kubenswrapper[4733]: I0318 10:17:59.983957 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa" gracePeriod=15 Mar 18 10:17:59 crc kubenswrapper[4733]: I0318 10:17:59.984550 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0" gracePeriod=15 Mar 18 10:18:00 crc 
kubenswrapper[4733]: I0318 10:18:00.003131 4733 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.003918 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.003953 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.003979 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.003993 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.004009 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004022 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.004037 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004050 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.004093 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004105 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.004121 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004133 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.004156 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004168 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.004220 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004238 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.004268 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004281 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004573 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004606 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004629 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004656 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004672 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004689 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004714 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.004779 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 10:18:00 crc kubenswrapper[4733]: E0318 10:18:00.005152 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.005170 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.010412 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.119173 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.119277 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.119323 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.119363 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.119394 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.119465 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.119508 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.119603 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.220211 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.220338 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.220360 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.220381 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.220399 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.220416 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.220466 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.220507 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.221161 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.221453 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.221490 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.221699 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.221701 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.221732 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.221723 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.221784 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.485236 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.488124 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.490659 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052" exitCode=0 Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.490731 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6" exitCode=0 Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.490750 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0" exitCode=0 Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.490767 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa" exitCode=2 Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.490879 4733 scope.go:117] "RemoveContainer" containerID="ba371d0dc81f8827d305037cab25306e3abe8ed3d243f74923b4709198f7ea38" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.494265 4733 generic.go:334] "Generic (PLEG): container finished" podID="71bc6618-8df4-4a35-9469-772a853eff06" containerID="7f5d2a4800b0b935a593d622bab229709f8902d75a6b9d3d310047bf50063a1a" exitCode=0 Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.494351 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71bc6618-8df4-4a35-9469-772a853eff06","Type":"ContainerDied","Data":"7f5d2a4800b0b935a593d622bab229709f8902d75a6b9d3d310047bf50063a1a"} Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.495599 4733 status_manager.go:851] "Failed to get 
status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:00 crc kubenswrapper[4733]: I0318 10:18:00.496102 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.179108 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.179540 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.507209 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.923438 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.924170 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.946264 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-kubelet-dir\") pod \"71bc6618-8df4-4a35-9469-772a853eff06\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.946417 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-var-lock\") pod \"71bc6618-8df4-4a35-9469-772a853eff06\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.946416 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "71bc6618-8df4-4a35-9469-772a853eff06" (UID: "71bc6618-8df4-4a35-9469-772a853eff06"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.946446 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bc6618-8df4-4a35-9469-772a853eff06-kube-api-access\") pod \"71bc6618-8df4-4a35-9469-772a853eff06\" (UID: \"71bc6618-8df4-4a35-9469-772a853eff06\") " Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.946468 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-var-lock" (OuterVolumeSpecName: "var-lock") pod "71bc6618-8df4-4a35-9469-772a853eff06" (UID: "71bc6618-8df4-4a35-9469-772a853eff06"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.946766 4733 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.946777 4733 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/71bc6618-8df4-4a35-9469-772a853eff06-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:01 crc kubenswrapper[4733]: I0318 10:18:01.952958 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bc6618-8df4-4a35-9469-772a853eff06-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "71bc6618-8df4-4a35-9469-772a853eff06" (UID: "71bc6618-8df4-4a35-9469-772a853eff06"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.048226 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71bc6618-8df4-4a35-9469-772a853eff06-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.370074 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.371069 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.371737 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.372218 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.451705 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.451766 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.451820 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.451877 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.451972 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.452169 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.452629 4733 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.452690 4733 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.452711 4733 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.517054 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.517943 4733 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba" exitCode=0 Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.518010 4733 scope.go:117] "RemoveContainer" containerID="6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.518030 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.519845 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"71bc6618-8df4-4a35-9469-772a853eff06","Type":"ContainerDied","Data":"5f44602afd79b72c25bccf945c72fd688dbc42ff6b86533bf0722398bd85fb3d"} Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.519888 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f44602afd79b72c25bccf945c72fd688dbc42ff6b86533bf0722398bd85fb3d" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.519975 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.532848 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.533573 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.539788 4733 scope.go:117] "RemoveContainer" containerID="b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.544842 4733 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.545576 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.554308 4733 scope.go:117] "RemoveContainer" containerID="7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.574410 4733 scope.go:117] "RemoveContainer" containerID="edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.595901 4733 scope.go:117] "RemoveContainer" containerID="1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.617242 4733 scope.go:117] "RemoveContainer" containerID="ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.637636 4733 scope.go:117] "RemoveContainer" containerID="6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052" Mar 18 10:18:02 crc kubenswrapper[4733]: E0318 10:18:02.638284 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\": container with ID starting with 6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052 not found: ID does not exist" containerID="6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.638361 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052"} err="failed to get container status \"6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\": rpc error: code = NotFound desc = could not find container \"6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052\": container with ID starting with 6fa9eed1a11fd6a14b82ea9f34ead9b9c67e9c9d52c2675651b37f9838875052 not found: ID does not exist" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.638413 4733 scope.go:117] "RemoveContainer" containerID="b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6" Mar 18 10:18:02 crc kubenswrapper[4733]: E0318 10:18:02.638873 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\": container with ID starting with b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6 not found: ID does not exist" containerID="b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.638917 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6"} err="failed to get container status \"b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\": rpc error: code = NotFound desc = could not find container \"b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6\": container with ID starting with b698902beccdf67c5646c01b34eea131f61dee8d5d6e1f566cdb70c930b2cde6 not found: ID does not exist" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.638946 4733 scope.go:117] "RemoveContainer" containerID="7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0" Mar 18 10:18:02 crc kubenswrapper[4733]: E0318 
10:18:02.639386 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\": container with ID starting with 7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0 not found: ID does not exist" containerID="7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.639441 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0"} err="failed to get container status \"7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\": rpc error: code = NotFound desc = could not find container \"7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0\": container with ID starting with 7aaa002cf5203102149456e58fcc5db02a5e861736d3699e432a91186bac47d0 not found: ID does not exist" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.639509 4733 scope.go:117] "RemoveContainer" containerID="edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa" Mar 18 10:18:02 crc kubenswrapper[4733]: E0318 10:18:02.639928 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\": container with ID starting with edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa not found: ID does not exist" containerID="edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.639989 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa"} err="failed to get container status \"edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\": rpc 
error: code = NotFound desc = could not find container \"edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa\": container with ID starting with edcafff0c9902e275fc23a2f154d3030c0e751e2f3230a4ca226c9cef8efcbfa not found: ID does not exist" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.640010 4733 scope.go:117] "RemoveContainer" containerID="1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba" Mar 18 10:18:02 crc kubenswrapper[4733]: E0318 10:18:02.640319 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\": container with ID starting with 1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba not found: ID does not exist" containerID="1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.640346 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba"} err="failed to get container status \"1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\": rpc error: code = NotFound desc = could not find container \"1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba\": container with ID starting with 1614bd2915eb4ab62554cfe72d63669c062baaf25ae2e533788b876ff9544eba not found: ID does not exist" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.640364 4733 scope.go:117] "RemoveContainer" containerID="ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734" Mar 18 10:18:02 crc kubenswrapper[4733]: E0318 10:18:02.640671 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\": container with ID starting with 
ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734 not found: ID does not exist" containerID="ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734" Mar 18 10:18:02 crc kubenswrapper[4733]: I0318 10:18:02.640696 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734"} err="failed to get container status \"ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\": rpc error: code = NotFound desc = could not find container \"ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734\": container with ID starting with ea9fbcd7d532de4f7ea45ab0610692732e5b6f4df725aac2f68c72d8dbdb6734 not found: ID does not exist" Mar 18 10:18:03 crc kubenswrapper[4733]: I0318 10:18:03.190888 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.046157 4733 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:05 crc kubenswrapper[4733]: I0318 10:18:05.047129 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.083341 4733 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.184:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189de8279c9a6215 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:18:05.082575381 +0000 UTC m=+324.574309706,LastTimestamp:2026-03-18 10:18:05.082575381 +0000 UTC m=+324.574309706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:18:05 crc kubenswrapper[4733]: I0318 10:18:05.543289 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0"} Mar 18 10:18:05 crc kubenswrapper[4733]: I0318 10:18:05.543355 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e08d1b29840f18c35701cc9dbeb6a5d031bf4e9b063c0a1576f871038a4ba2b5"} Mar 18 10:18:05 crc 
kubenswrapper[4733]: I0318 10:18:05.544061 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.544073 4733 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.921210 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.922654 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.923160 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.923893 4733 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.924308 4733 controller.go:195] "Failed to update lease" 
err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:05 crc kubenswrapper[4733]: I0318 10:18:05.924342 4733 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 10:18:05 crc kubenswrapper[4733]: E0318 10:18:05.924617 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="200ms" Mar 18 10:18:06 crc kubenswrapper[4733]: E0318 10:18:06.125794 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="400ms" Mar 18 10:18:06 crc kubenswrapper[4733]: E0318 10:18:06.528370 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="800ms" Mar 18 10:18:07 crc kubenswrapper[4733]: E0318 10:18:07.329725 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="1.6s" Mar 18 10:18:07 crc kubenswrapper[4733]: E0318 10:18:07.634769 4733 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.184:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189de8279c9a6215 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 10:18:05.082575381 +0000 UTC m=+324.574309706,LastTimestamp:2026-03-18 10:18:05.082575381 +0000 UTC m=+324.574309706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 10:18:08 crc kubenswrapper[4733]: E0318 10:18:08.931420 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="3.2s" Mar 18 10:18:11 crc kubenswrapper[4733]: I0318 10:18:11.177988 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:11 crc kubenswrapper[4733]: I0318 10:18:11.422415 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" containerName="oauth-openshift" 
containerID="cri-o://2063ba38b8f338dff7686f6578cd42c9d0c532672eb45f293854b46ba18f0fea" gracePeriod=15 Mar 18 10:18:11 crc kubenswrapper[4733]: I0318 10:18:11.584380 4733 generic.go:334] "Generic (PLEG): container finished" podID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" containerID="2063ba38b8f338dff7686f6578cd42c9d0c532672eb45f293854b46ba18f0fea" exitCode=0 Mar 18 10:18:11 crc kubenswrapper[4733]: I0318 10:18:11.584452 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" event={"ID":"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1","Type":"ContainerDied","Data":"2063ba38b8f338dff7686f6578cd42c9d0c532672eb45f293854b46ba18f0fea"} Mar 18 10:18:11 crc kubenswrapper[4733]: I0318 10:18:11.934311 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" Mar 18 10:18:11 crc kubenswrapper[4733]: I0318 10:18:11.935136 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:11 crc kubenswrapper[4733]: I0318 10:18:11.935427 4733 status_manager.go:851] "Failed to get status for pod" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n6hmz\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.029070 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle\") 
pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.029122 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.029149 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.029208 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-dir\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.029227 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") " Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.029372 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.030091 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.030109 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.030206 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.030344 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.030384 4733 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.030404 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.030417 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.030428 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131266 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-login\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131331 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-serving-cert\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131349 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-provider-selection\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131382 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-router-certs\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131443 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-ocp-branding-template\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131461 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-error\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131482 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-idp-0-file-data\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131504 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4clx\" (UniqueName: \"kubernetes.io/projected/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-kube-api-access-h4clx\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131606 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session\") pod \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\" (UID: \"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1\") "
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.131972 4733 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: E0318 10:18:12.132911 4733 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="6.4s"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.138860 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.138984 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-kube-api-access-h4clx" (OuterVolumeSpecName: "kube-api-access-h4clx") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "kube-api-access-h4clx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.139605 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.139903 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.140168 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.140378 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.140637 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.141031 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.142750 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" (UID: "486eda8c-6e6f-4761-b28c-8aeb72fcfcc1"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.175361 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.175952 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.176544 4733 status_manager.go:851] "Failed to get status for pod" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n6hmz\": dial tcp 38.102.83.184:6443: connect: connection refused"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.190442 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.190478 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:12 crc kubenswrapper[4733]: E0318 10:18:12.190849 4733 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.191347 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:12 crc kubenswrapper[4733]: W0318 10:18:12.207582 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-dc696d54957a20c5a5cfe41b4bb7d9a141673cb5d5e8a271f41febb11379b5d8 WatchSource:0}: Error finding container dc696d54957a20c5a5cfe41b4bb7d9a141673cb5d5e8a271f41febb11379b5d8: Status 404 returned error can't find the container with id dc696d54957a20c5a5cfe41b4bb7d9a141673cb5d5e8a271f41febb11379b5d8
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232409 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232437 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232450 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232460 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232471 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232482 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232491 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232501 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4clx\" (UniqueName: \"kubernetes.io/projected/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-kube-api-access-h4clx\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.232510 4733 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.595221 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.595229 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" event={"ID":"486eda8c-6e6f-4761-b28c-8aeb72fcfcc1","Type":"ContainerDied","Data":"a4a546ed80545bf50a0d399d05bcd3718be5de86367b6c0e97b326427eeeb776"}
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.595821 4733 scope.go:117] "RemoveContainer" containerID="2063ba38b8f338dff7686f6578cd42c9d0c532672eb45f293854b46ba18f0fea"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.596568 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.596841 4733 status_manager.go:851] "Failed to get status for pod" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n6hmz\": dial tcp 38.102.83.184:6443: connect: connection refused"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.599940 4733 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="caedb19efef737343ed2995bbe146d00288783b4e673a1c79a4ec21f417316dc" exitCode=0
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.600015 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"caedb19efef737343ed2995bbe146d00288783b4e673a1c79a4ec21f417316dc"}
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.600064 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dc696d54957a20c5a5cfe41b4bb7d9a141673cb5d5e8a271f41febb11379b5d8"}
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.601258 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.601612 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.602280 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused"
Mar 18 10:18:12 crc kubenswrapper[4733]: E0318 10:18:12.602491 4733 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.602910 4733 status_manager.go:851] "Failed to get status for pod" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n6hmz\": dial tcp 38.102.83.184:6443: connect: connection refused"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.622209 4733 status_manager.go:851] "Failed to get status for pod" podUID="71bc6618-8df4-4a35-9469-772a853eff06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused"
Mar 18 10:18:12 crc kubenswrapper[4733]: I0318 10:18:12.622792 4733 status_manager.go:851] "Failed to get status for pod" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" pod="openshift-authentication/oauth-openshift-558db77b4-n6hmz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-n6hmz\": dial tcp 38.102.83.184:6443: connect: connection refused"
Mar 18 10:18:13 crc kubenswrapper[4733]: I0318 10:18:13.633055 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"101b56ee994525adbab0b16f8181f6df6c1e8c1a27b1f7f26f318c3625984c1f"}
Mar 18 10:18:13 crc kubenswrapper[4733]: I0318 10:18:13.633426 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fc338a5575a4bed904037de792db05083f4715433f683a21c15af669a961168d"}
Mar 18 10:18:13 crc kubenswrapper[4733]: I0318 10:18:13.633437 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c7441d26972d7d30232fee4709846b01e8d87f67f2beecda109a66454d99a988"}
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.643343 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"961eee9018b09c4798c5cf5e9d4815d8c6df735727c5ebac8789e1399136eb39"}
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.643781 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.643797 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2ad1cb73036760030ef554884132976824f996c9f56abba7b77c7033d32dfa23"}
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.643655 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.643824 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.645746 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.646313 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.646367 4733 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822" exitCode=1
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.646393 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822"}
Mar 18 10:18:14 crc kubenswrapper[4733]: I0318 10:18:14.646936 4733 scope.go:117] "RemoveContainer" containerID="7e84c65c99c9c698f4097bbffe0efebd320e4fc2c4a58788a606e7f0b98e1822"
Mar 18 10:18:15 crc kubenswrapper[4733]: I0318 10:18:15.657636 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 10:18:15 crc kubenswrapper[4733]: I0318 10:18:15.658450 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 18 10:18:15 crc kubenswrapper[4733]: I0318 10:18:15.658500 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"569551dc2aa3d7b4b8162b0916c16e4ef346d8ca060ae26c5676163ad541f8f2"}
Mar 18 10:18:16 crc kubenswrapper[4733]: I0318 10:18:16.643988 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 10:18:17 crc kubenswrapper[4733]: I0318 10:18:17.192091 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:17 crc kubenswrapper[4733]: I0318 10:18:17.192159 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:17 crc kubenswrapper[4733]: I0318 10:18:17.203350 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:18 crc kubenswrapper[4733]: I0318 10:18:18.889454 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 10:18:18 crc kubenswrapper[4733]: I0318 10:18:18.896623 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 10:18:19 crc kubenswrapper[4733]: I0318 10:18:19.659451 4733 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:19 crc kubenswrapper[4733]: I0318 10:18:19.688388 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:19 crc kubenswrapper[4733]: I0318 10:18:19.688439 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:19 crc kubenswrapper[4733]: I0318 10:18:19.696905 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 10:18:20 crc kubenswrapper[4733]: I0318 10:18:20.992260 4733 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:20 crc kubenswrapper[4733]: I0318 10:18:20.992312 4733 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ddb303e3-8922-4b43-9bba-2d3f0c30c6b8"
Mar 18 10:18:21 crc kubenswrapper[4733]: I0318 10:18:21.190032 4733 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="43b0fae6-f355-4c4a-a6d1-10662e33fa79"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.301121 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.301291 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.301376 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.301435 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.303409 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.304099 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.304116 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.304701 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.313479 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.314020 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.321372 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.322303 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3650177-e338-4eba-ab42-bc0cd14c9d65-metrics-certs\") pod \"network-metrics-daemon-4s425\" (UID: \"b3650177-e338-4eba-ab42-bc0cd14c9d65\") " pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.327079 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.402842 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.408252 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.498114 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.618140 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.620530 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.625896 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 10:18:23 crc kubenswrapper[4733]: I0318 10:18:23.628797 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4s425"
Mar 18 10:18:24 crc kubenswrapper[4733]: W0318 10:18:24.178537 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-52c5d4fd6a6df793c0b80dca1e6d93c8b8b34acbd9d37159057b2efe0c813fb4 WatchSource:0}: Error finding container 52c5d4fd6a6df793c0b80dca1e6d93c8b8b34acbd9d37159057b2efe0c813fb4: Status 404 returned error can't find the container with id 52c5d4fd6a6df793c0b80dca1e6d93c8b8b34acbd9d37159057b2efe0c813fb4
Mar 18 10:18:24 crc kubenswrapper[4733]: W0318 10:18:24.256944 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3650177_e338_4eba_ab42_bc0cd14c9d65.slice/crio-03aba841ac2e1d2a33eb39d018365decea883523496253a386082ce0a0330775 WatchSource:0}: Error finding container 03aba841ac2e1d2a33eb39d018365decea883523496253a386082ce0a0330775: Status 404 returned error can't find the container with id 03aba841ac2e1d2a33eb39d018365decea883523496253a386082ce0a0330775
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.024268 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ea66405bcfb3ac15e857ca56bfc2529bb805d39049cd5b8e1864cc65a95481b4"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.024812 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a09cb9227fb4162e36e81dce54e11238242d400b1d1035135a6683d0e4dc78cb"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.027249 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"269f3014e6f5c6075c765c0da6300d4eaf9926fd1df5ac0665716155c9042d44"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.027342 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8e97a2edc2af846924e5525c4644039e7d7375498d8a02fe8c5c3850861c5b45"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.030204 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4s425" event={"ID":"b3650177-e338-4eba-ab42-bc0cd14c9d65","Type":"ContainerStarted","Data":"8e03bee5addf0ed44c0da8f14501e548939549f0389b5b4d987b33d193738139"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.030271 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4s425" event={"ID":"b3650177-e338-4eba-ab42-bc0cd14c9d65","Type":"ContainerStarted","Data":"ec7b75e20fa98c379acdef7414db972be5dfe4a9ed77cb9b8b2a54f4bd2dcaa8"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.030285 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4s425" event={"ID":"b3650177-e338-4eba-ab42-bc0cd14c9d65","Type":"ContainerStarted","Data":"03aba841ac2e1d2a33eb39d018365decea883523496253a386082ce0a0330775"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.032619 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c0afe4acc9ed318cef8afdacb201f65e8a1b8e6470bd18362461824eb194d68e"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.032676 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"52c5d4fd6a6df793c0b80dca1e6d93c8b8b34acbd9d37159057b2efe0c813fb4"}
Mar 18 10:18:25 crc kubenswrapper[4733]: I0318 10:18:25.032835 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 10:18:26 crc kubenswrapper[4733]: I0318 10:18:26.043060 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 18 10:18:26 crc kubenswrapper[4733]: I0318 10:18:26.043694 4733 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="269f3014e6f5c6075c765c0da6300d4eaf9926fd1df5ac0665716155c9042d44" exitCode=255
Mar 18 10:18:26 crc kubenswrapper[4733]: I0318 10:18:26.043837 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"269f3014e6f5c6075c765c0da6300d4eaf9926fd1df5ac0665716155c9042d44"}
Mar 18 10:18:26 crc kubenswrapper[4733]: I0318 10:18:26.044776 4733 scope.go:117] "RemoveContainer" containerID="269f3014e6f5c6075c765c0da6300d4eaf9926fd1df5ac0665716155c9042d44"
Mar 18 10:18:26 crc kubenswrapper[4733]: I0318 10:18:26.653936 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 10:18:27 crc kubenswrapper[4733]: I0318 10:18:27.051542 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 18 10:18:27 crc kubenswrapper[4733]: I0318 10:18:27.051589 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"11a8ded935f89e7b9d933f1c47e20a0e908f0e892416bc28ad676a8d61146191"}
Mar 18 10:18:28 crc kubenswrapper[4733]: I0318 10:18:28.068300 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar 18 10:18:28 crc kubenswrapper[4733]: I0318 10:18:28.069991 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log"
Mar 18 10:18:28 crc kubenswrapper[4733]: I0318 10:18:28.070059 4733 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="11a8ded935f89e7b9d933f1c47e20a0e908f0e892416bc28ad676a8d61146191" exitCode=255
Mar 18 10:18:28 crc kubenswrapper[4733]: I0318 10:18:28.070127 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"11a8ded935f89e7b9d933f1c47e20a0e908f0e892416bc28ad676a8d61146191"}
Mar 18 10:18:28 crc kubenswrapper[4733]: I0318 10:18:28.070274 4733 scope.go:117] "RemoveContainer" containerID="269f3014e6f5c6075c765c0da6300d4eaf9926fd1df5ac0665716155c9042d44"
Mar 18 10:18:28 crc kubenswrapper[4733]: I0318 10:18:28.071228 4733 scope.go:117] "RemoveContainer" containerID="11a8ded935f89e7b9d933f1c47e20a0e908f0e892416bc28ad676a8d61146191"
Mar 18 10:18:28 crc kubenswrapper[4733]: E0318 10:18:28.071623 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints
pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:18:29 crc kubenswrapper[4733]: I0318 10:18:29.082404 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 10:18:30 crc kubenswrapper[4733]: I0318 10:18:30.529709 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 10:18:30 crc kubenswrapper[4733]: I0318 10:18:30.832243 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 10:18:30 crc kubenswrapper[4733]: I0318 10:18:30.958843 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 10:18:31 crc kubenswrapper[4733]: I0318 10:18:31.180573 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 10:18:31 crc kubenswrapper[4733]: I0318 10:18:31.517984 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 10:18:31 crc kubenswrapper[4733]: I0318 10:18:31.865573 4733 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 10:18:31 crc kubenswrapper[4733]: I0318 10:18:31.927499 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 10:18:31 crc kubenswrapper[4733]: I0318 10:18:31.930019 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 10:18:32 crc 
kubenswrapper[4733]: I0318 10:18:32.021700 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 10:18:32 crc kubenswrapper[4733]: I0318 10:18:32.134915 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 10:18:32 crc kubenswrapper[4733]: I0318 10:18:32.156686 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 10:18:32 crc kubenswrapper[4733]: I0318 10:18:32.160496 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 10:18:32 crc kubenswrapper[4733]: I0318 10:18:32.185933 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 10:18:32 crc kubenswrapper[4733]: I0318 10:18:32.374080 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 10:18:32 crc kubenswrapper[4733]: I0318 10:18:32.541375 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 10:18:32 crc kubenswrapper[4733]: I0318 10:18:32.715143 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 10:18:32 crc kubenswrapper[4733]: I0318 10:18:32.772332 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 10:18:33 crc kubenswrapper[4733]: I0318 10:18:33.028013 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 10:18:33 crc kubenswrapper[4733]: I0318 10:18:33.203601 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 
10:18:33 crc kubenswrapper[4733]: I0318 10:18:33.332705 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 10:18:33 crc kubenswrapper[4733]: I0318 10:18:33.578870 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 10:18:33 crc kubenswrapper[4733]: I0318 10:18:33.620229 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 10:18:33 crc kubenswrapper[4733]: I0318 10:18:33.860375 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 10:18:33 crc kubenswrapper[4733]: I0318 10:18:33.977564 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 10:18:33 crc kubenswrapper[4733]: I0318 10:18:33.995443 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.110442 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.202743 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.318319 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.348557 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.376402 4733 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.405978 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.439275 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.521996 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.532960 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.539195 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.547525 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.553398 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.714408 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.786446 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.904291 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 
10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.921590 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.967336 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 10:18:34 crc kubenswrapper[4733]: I0318 10:18:34.973735 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.041123 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.056082 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.176882 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.607901 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.623826 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.639268 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.696160 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.715358 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.737552 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.744154 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.802046 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.872157 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.872168 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.958672 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 10:18:35 crc kubenswrapper[4733]: I0318 10:18:35.963101 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.087550 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.111913 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.128677 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 10:18:36 crc 
kubenswrapper[4733]: I0318 10:18:36.156338 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.190618 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.299315 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.303511 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.374492 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.507842 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.546801 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.551104 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.700883 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.723906 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.740041 4733 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.824941 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 10:18:36 crc kubenswrapper[4733]: I0318 10:18:36.955906 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.044809 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.085593 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.137530 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.150625 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.210574 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.238573 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.395655 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.434485 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 10:18:37 crc kubenswrapper[4733]: 
I0318 10:18:37.467392 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.500572 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.505290 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.575530 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.685850 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.694454 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.732543 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.832461 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 10:18:37 crc kubenswrapper[4733]: I0318 10:18:37.916247 4733 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.001612 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.007074 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 10:18:38 crc 
kubenswrapper[4733]: I0318 10:18:38.211754 4733 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.244004 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.267427 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.373252 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.379114 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.465156 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.484072 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.603312 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.640865 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.681335 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.702419 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.776882 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.904061 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.910028 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.940369 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.941822 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 10:18:38 crc kubenswrapper[4733]: I0318 10:18:38.945307 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.001722 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.105636 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.147602 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.202001 4733 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.207360 4733 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4s425" podStartSLOduration=301.207335519 podStartE2EDuration="5m1.207335519s" podCreationTimestamp="2026-03-18 10:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:18:25.103565048 +0000 UTC m=+344.595299403" watchObservedRunningTime="2026-03-18 10:18:39.207335519 +0000 UTC m=+358.699069874" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.209848 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-n6hmz"] Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.209937 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.209980 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4s425"] Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.215739 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.246355 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.246326706 podStartE2EDuration="20.246326706s" podCreationTimestamp="2026-03-18 10:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:18:39.24013955 +0000 UTC m=+358.731873935" watchObservedRunningTime="2026-03-18 10:18:39.246326706 +0000 UTC m=+358.738061061" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.273398 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 
10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.286679 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.346034 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.347951 4733 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.348617 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.384559 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.487000 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.513339 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.513876 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.702120 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.708519 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.743040 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 10:18:39 crc kubenswrapper[4733]: I0318 10:18:39.771687 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.034986 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.201690 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.221180 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.339630 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.427686 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.683634 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.729365 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.757403 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563818-44h4f"] Mar 18 10:18:40 crc kubenswrapper[4733]: E0318 10:18:40.757719 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" containerName="oauth-openshift" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.757743 4733 
state_mem.go:107] "Deleted CPUSet assignment" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" containerName="oauth-openshift" Mar 18 10:18:40 crc kubenswrapper[4733]: E0318 10:18:40.757760 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bc6618-8df4-4a35-9469-772a853eff06" containerName="installer" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.757770 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bc6618-8df4-4a35-9469-772a853eff06" containerName="installer" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.757924 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bc6618-8df4-4a35-9469-772a853eff06" containerName="installer" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.757942 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" containerName="oauth-openshift" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.758686 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-44h4f" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.765541 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-44h4f"] Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.767334 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.767338 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.767407 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.797675 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84w8q\" (UniqueName: \"kubernetes.io/projected/c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2-kube-api-access-84w8q\") pod \"auto-csr-approver-29563818-44h4f\" (UID: \"c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2\") " pod="openshift-infra/auto-csr-approver-29563818-44h4f" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.849052 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.879763 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.898926 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84w8q\" (UniqueName: \"kubernetes.io/projected/c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2-kube-api-access-84w8q\") pod \"auto-csr-approver-29563818-44h4f\" (UID: \"c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2\") " 
pod="openshift-infra/auto-csr-approver-29563818-44h4f" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.905869 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.925720 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84w8q\" (UniqueName: \"kubernetes.io/projected/c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2-kube-api-access-84w8q\") pod \"auto-csr-approver-29563818-44h4f\" (UID: \"c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2\") " pod="openshift-infra/auto-csr-approver-29563818-44h4f" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.974997 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 10:18:40 crc kubenswrapper[4733]: I0318 10:18:40.982388 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.083655 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-44h4f" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.101112 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.115138 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.167023 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.183228 4733 scope.go:117] "RemoveContainer" containerID="11a8ded935f89e7b9d933f1c47e20a0e908f0e892416bc28ad676a8d61146191" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.183917 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486eda8c-6e6f-4761-b28c-8aeb72fcfcc1" path="/var/lib/kubelet/pods/486eda8c-6e6f-4761-b28c-8aeb72fcfcc1/volumes" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.220666 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.275630 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.377138 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.445093 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.447077 4733 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.450744 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.472297 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.512906 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.531492 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.536704 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.550751 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-44h4f"] Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.556828 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.627009 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.747430 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.763463 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.794261 4733 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.805030 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.826326 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.919590 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.946661 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.993831 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 10:18:41 crc kubenswrapper[4733]: I0318 10:18:41.994841 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.041946 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.119385 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.154814 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.180692 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.181040 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7157857ac224531b0050659a4ecc30a225cc12de5cd75c465a22ea80041c47d0"} Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.182672 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-44h4f" event={"ID":"c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2","Type":"ContainerStarted","Data":"60980c1eaefeb0e6ad1eda08b1369c7dacc1d0cb042f4a7a9717e7a4226cf89a"} Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.231234 4733 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.231504 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0" gracePeriod=5 Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.397167 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.595289 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.617974 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.696180 4733 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.705917 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.760932 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.761584 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.842294 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.892841 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.930005 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 10:18:42 crc kubenswrapper[4733]: I0318 10:18:42.956332 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.033140 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.110361 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.143021 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 10:18:43 
crc kubenswrapper[4733]: I0318 10:18:43.166918 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.195848 4733 generic.go:334] "Generic (PLEG): container finished" podID="c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2" containerID="4287f6e7720d29c2928f6ce2bc4de5dd996378a83ad9d6dd58331a0b52048815" exitCode=0 Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.196613 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-44h4f" event={"ID":"c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2","Type":"ContainerDied","Data":"4287f6e7720d29c2928f6ce2bc4de5dd996378a83ad9d6dd58331a0b52048815"} Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.198077 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.198962 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.199162 4733 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="7157857ac224531b0050659a4ecc30a225cc12de5cd75c465a22ea80041c47d0" exitCode=255 Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.199215 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"7157857ac224531b0050659a4ecc30a225cc12de5cd75c465a22ea80041c47d0"} Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.199281 4733 scope.go:117] "RemoveContainer" 
containerID="11a8ded935f89e7b9d933f1c47e20a0e908f0e892416bc28ad676a8d61146191" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.212827 4733 scope.go:117] "RemoveContainer" containerID="7157857ac224531b0050659a4ecc30a225cc12de5cd75c465a22ea80041c47d0" Mar 18 10:18:43 crc kubenswrapper[4733]: E0318 10:18:43.213443 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.296491 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.353115 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.460422 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.477282 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.601678 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.612168 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.623985 4733 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.702504 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.726556 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 10:18:43 crc kubenswrapper[4733]: I0318 10:18:43.992423 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.208884 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.223760 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.283458 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.373016 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.450380 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.494201 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.540440 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-44h4f" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.618293 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.659140 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84w8q\" (UniqueName: \"kubernetes.io/projected/c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2-kube-api-access-84w8q\") pod \"c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2\" (UID: \"c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2\") " Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.679571 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2-kube-api-access-84w8q" (OuterVolumeSpecName: "kube-api-access-84w8q") pod "c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2" (UID: "c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2"). InnerVolumeSpecName "kube-api-access-84w8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.761092 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84w8q\" (UniqueName: \"kubernetes.io/projected/c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2-kube-api-access-84w8q\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.800801 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.907006 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 10:18:44 crc kubenswrapper[4733]: I0318 10:18:44.979481 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.031918 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.089650 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.152467 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.217130 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563818-44h4f" event={"ID":"c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2","Type":"ContainerDied","Data":"60980c1eaefeb0e6ad1eda08b1369c7dacc1d0cb042f4a7a9717e7a4226cf89a"} Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.217204 4733 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="60980c1eaefeb0e6ad1eda08b1369c7dacc1d0cb042f4a7a9717e7a4226cf89a" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.221606 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563818-44h4f" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.287062 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.297753 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.368742 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.550697 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.557599 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.597225 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.649357 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.747645 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.841506 4733 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-copy-resources" Mar 18 10:18:45 crc kubenswrapper[4733]: I0318 10:18:45.858291 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 10:18:46 crc kubenswrapper[4733]: I0318 10:18:46.012986 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 10:18:46 crc kubenswrapper[4733]: I0318 10:18:46.179331 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 10:18:46 crc kubenswrapper[4733]: I0318 10:18:46.223941 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 10:18:46 crc kubenswrapper[4733]: I0318 10:18:46.518586 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 10:18:46 crc kubenswrapper[4733]: I0318 10:18:46.530583 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 10:18:46 crc kubenswrapper[4733]: I0318 10:18:46.797762 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 10:18:46 crc kubenswrapper[4733]: I0318 10:18:46.940832 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.030023 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.057621 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.130994 4733 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.148915 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.191559 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.221439 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.390841 4733 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.498778 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.809553 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.809640 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.905846 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.906278 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.906469 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.906658 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.906832 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.906066 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.906325 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.906769 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.906893 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.907564 4733 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.907686 4733 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.907825 4733 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.907945 4733 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:47 crc kubenswrapper[4733]: I0318 10:18:47.917953 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.009034 4733 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.238106 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.238837 4733 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0" exitCode=137 Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.238936 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.238956 4733 scope.go:117] "RemoveContainer" containerID="2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0" Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.255928 4733 scope.go:117] "RemoveContainer" containerID="2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0" Mar 18 10:18:48 crc kubenswrapper[4733]: E0318 10:18:48.256587 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0\": container with ID starting with 2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0 not found: ID does not exist" containerID="2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0" Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.256637 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0"} err="failed to get container status \"2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0\": rpc error: code = NotFound desc = could not find container \"2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0\": container with ID starting with 2a3ba4486c7a6274fac695e45bc91d2e46d704c4c9018832a7072975d4fecee0 not found: ID does not exist" Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.345902 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 10:18:48 crc kubenswrapper[4733]: I0318 10:18:48.814885 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.183470 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.383544 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.547268 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68755f559b-s4zq9"] Mar 18 10:18:49 crc kubenswrapper[4733]: E0318 10:18:49.547534 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.547548 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 10:18:49 crc kubenswrapper[4733]: E0318 10:18:49.547567 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2" 
containerName="oc" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.547574 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2" containerName="oc" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.547695 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2" containerName="oc" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.547710 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.548120 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.550887 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.551266 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.552020 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.553113 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.553272 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.553324 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 
10:18:49.553475 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.553244 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.553663 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.553735 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.554747 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.555228 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.562333 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.567274 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68755f559b-s4zq9"] Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.580943 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.583225 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628400 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-router-certs\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628448 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628472 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-error\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628493 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628518 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628549 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-service-ca\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628597 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khh9z\" (UniqueName: \"kubernetes.io/projected/2c695847-f9ab-4d8f-8e05-868fec637b86-kube-api-access-khh9z\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628621 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-audit-policies\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628686 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c695847-f9ab-4d8f-8e05-868fec637b86-audit-dir\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " 
pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628741 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-session\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628773 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-login\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628832 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628938 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.628984 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730701 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730770 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-router-certs\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730797 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730824 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730850 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-error\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730869 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730885 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-service-ca\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730904 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khh9z\" (UniqueName: \"kubernetes.io/projected/2c695847-f9ab-4d8f-8e05-868fec637b86-kube-api-access-khh9z\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " 
pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730929 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-audit-policies\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730955 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c695847-f9ab-4d8f-8e05-868fec637b86-audit-dir\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.730996 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-session\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.731026 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-login\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.731059 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.731117 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.731922 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.732141 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c695847-f9ab-4d8f-8e05-868fec637b86-audit-dir\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.732668 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-service-ca\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc 
kubenswrapper[4733]: I0318 10:18:49.733385 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-audit-policies\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.734637 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.736371 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.737085 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-login\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.737973 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.738529 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.739091 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-router-certs\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.740584 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-template-error\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.746877 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " 
pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.756794 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c695847-f9ab-4d8f-8e05-868fec637b86-v4-0-config-system-session\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.756973 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khh9z\" (UniqueName: \"kubernetes.io/projected/2c695847-f9ab-4d8f-8e05-868fec637b86-kube-api-access-khh9z\") pod \"oauth-openshift-68755f559b-s4zq9\" (UID: \"2c695847-f9ab-4d8f-8e05-868fec637b86\") " pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:49 crc kubenswrapper[4733]: I0318 10:18:49.861882 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:50 crc kubenswrapper[4733]: I0318 10:18:50.300321 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68755f559b-s4zq9"] Mar 18 10:18:50 crc kubenswrapper[4733]: W0318 10:18:50.312897 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c695847_f9ab_4d8f_8e05_868fec637b86.slice/crio-a8661530409955b5c494f8be4fcd86a397e0f4e70f27c610972ff3a633a2e60e WatchSource:0}: Error finding container a8661530409955b5c494f8be4fcd86a397e0f4e70f27c610972ff3a633a2e60e: Status 404 returned error can't find the container with id a8661530409955b5c494f8be4fcd86a397e0f4e70f27c610972ff3a633a2e60e Mar 18 10:18:51 crc kubenswrapper[4733]: I0318 10:18:51.271796 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" event={"ID":"2c695847-f9ab-4d8f-8e05-868fec637b86","Type":"ContainerStarted","Data":"4110f899e8265094fb2fbbd365d9ea24f37606fce5120f3e02a5bb0445cbe332"} Mar 18 10:18:51 crc kubenswrapper[4733]: I0318 10:18:51.272295 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" event={"ID":"2c695847-f9ab-4d8f-8e05-868fec637b86","Type":"ContainerStarted","Data":"a8661530409955b5c494f8be4fcd86a397e0f4e70f27c610972ff3a633a2e60e"} Mar 18 10:18:51 crc kubenswrapper[4733]: I0318 10:18:51.272340 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:51 crc kubenswrapper[4733]: I0318 10:18:51.281802 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" Mar 18 10:18:51 crc kubenswrapper[4733]: I0318 10:18:51.314945 4733 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-authentication/oauth-openshift-68755f559b-s4zq9" podStartSLOduration=65.314917027 podStartE2EDuration="1m5.314917027s" podCreationTimestamp="2026-03-18 10:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:18:51.313821686 +0000 UTC m=+370.805556021" watchObservedRunningTime="2026-03-18 10:18:51.314917027 +0000 UTC m=+370.806651372" Mar 18 10:18:57 crc kubenswrapper[4733]: I0318 10:18:57.181129 4733 scope.go:117] "RemoveContainer" containerID="7157857ac224531b0050659a4ecc30a225cc12de5cd75c465a22ea80041c47d0" Mar 18 10:18:57 crc kubenswrapper[4733]: E0318 10:18:57.182449 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 10:19:03 crc kubenswrapper[4733]: I0318 10:19:03.361777 4733 generic.go:334] "Generic (PLEG): container finished" podID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerID="3d92f9fbfa1c8b8490e331060d587d908cf420777497bf90bb4815f3f49e79dd" exitCode=0 Mar 18 10:19:03 crc kubenswrapper[4733]: I0318 10:19:03.361864 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" event={"ID":"5192f67b-f2ab-45eb-9b1a-64bdff02437a","Type":"ContainerDied","Data":"3d92f9fbfa1c8b8490e331060d587d908cf420777497bf90bb4815f3f49e79dd"} Mar 18 10:19:03 crc kubenswrapper[4733]: I0318 10:19:03.363009 4733 scope.go:117] "RemoveContainer" containerID="3d92f9fbfa1c8b8490e331060d587d908cf420777497bf90bb4815f3f49e79dd" Mar 18 10:19:03 crc kubenswrapper[4733]: I0318 10:19:03.626157 4733 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 10:19:04 crc kubenswrapper[4733]: I0318 10:19:04.369610 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" event={"ID":"5192f67b-f2ab-45eb-9b1a-64bdff02437a","Type":"ContainerStarted","Data":"e99c56e1939c6c49ea2bc0d06c119ca0495ae09507c35a951b28f4145d07b5a2"} Mar 18 10:19:04 crc kubenswrapper[4733]: I0318 10:19:04.369980 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:19:04 crc kubenswrapper[4733]: I0318 10:19:04.371417 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:19:08 crc kubenswrapper[4733]: I0318 10:19:08.175828 4733 scope.go:117] "RemoveContainer" containerID="7157857ac224531b0050659a4ecc30a225cc12de5cd75c465a22ea80041c47d0" Mar 18 10:19:08 crc kubenswrapper[4733]: I0318 10:19:08.404011 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 10:19:08 crc kubenswrapper[4733]: I0318 10:19:08.404502 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8f0aa49adcee1882dbf9476e69a98aecde2bd5d5ac8dc2661bf8de9a9e923a4a"} Mar 18 10:19:43 crc kubenswrapper[4733]: I0318 10:19:43.572093 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:19:43 
crc kubenswrapper[4733]: I0318 10:19:43.572999 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.137968 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563820-x8mq8"] Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.139947 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-x8mq8" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.143976 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.145352 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.145841 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.154899 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-x8mq8"] Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.271871 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grxg2\" (UniqueName: \"kubernetes.io/projected/949d71ae-f754-4b5c-8c0b-fec8d374f27e-kube-api-access-grxg2\") pod \"auto-csr-approver-29563820-x8mq8\" (UID: \"949d71ae-f754-4b5c-8c0b-fec8d374f27e\") " pod="openshift-infra/auto-csr-approver-29563820-x8mq8" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.373330 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-grxg2\" (UniqueName: \"kubernetes.io/projected/949d71ae-f754-4b5c-8c0b-fec8d374f27e-kube-api-access-grxg2\") pod \"auto-csr-approver-29563820-x8mq8\" (UID: \"949d71ae-f754-4b5c-8c0b-fec8d374f27e\") " pod="openshift-infra/auto-csr-approver-29563820-x8mq8" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.402144 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grxg2\" (UniqueName: \"kubernetes.io/projected/949d71ae-f754-4b5c-8c0b-fec8d374f27e-kube-api-access-grxg2\") pod \"auto-csr-approver-29563820-x8mq8\" (UID: \"949d71ae-f754-4b5c-8c0b-fec8d374f27e\") " pod="openshift-infra/auto-csr-approver-29563820-x8mq8" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.459566 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-x8mq8" Mar 18 10:20:00 crc kubenswrapper[4733]: I0318 10:20:00.946764 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-x8mq8"] Mar 18 10:20:01 crc kubenswrapper[4733]: I0318 10:20:01.774775 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-x8mq8" event={"ID":"949d71ae-f754-4b5c-8c0b-fec8d374f27e","Type":"ContainerStarted","Data":"651bd5a061b0713de78b4145e584b9b64603e782db5f257b9677273fe6364f7e"} Mar 18 10:20:02 crc kubenswrapper[4733]: I0318 10:20:02.787404 4733 generic.go:334] "Generic (PLEG): container finished" podID="949d71ae-f754-4b5c-8c0b-fec8d374f27e" containerID="4d1f85ec68f66c1e8dcc6134fd20cc9907c6036a83ddad6341fd815f0c10f145" exitCode=0 Mar 18 10:20:02 crc kubenswrapper[4733]: I0318 10:20:02.787497 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-x8mq8" 
event={"ID":"949d71ae-f754-4b5c-8c0b-fec8d374f27e","Type":"ContainerDied","Data":"4d1f85ec68f66c1e8dcc6134fd20cc9907c6036a83ddad6341fd815f0c10f145"} Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.783884 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mjkph"] Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.785871 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.803821 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mjkph"] Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.933249 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.933321 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-registry-tls\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.933369 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.933397 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-bound-sa-token\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.933430 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-registry-certificates\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.933764 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.933897 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-trusted-ca\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.933991 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24zq9\" (UniqueName: 
\"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-kube-api-access-24zq9\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:03 crc kubenswrapper[4733]: I0318 10:20:03.969637 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.035949 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24zq9\" (UniqueName: \"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-kube-api-access-24zq9\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.036020 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-registry-tls\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.036046 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 
10:20:04.036069 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-bound-sa-token\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.036095 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-registry-certificates\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.036124 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.036153 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-trusted-ca\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.037651 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.037765 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-trusted-ca\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.038808 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-registry-certificates\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.044233 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.044235 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-registry-tls\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.068172 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-bound-sa-token\") pod \"image-registry-66df7c8f76-mjkph\" (UID: 
\"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.073925 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24zq9\" (UniqueName: \"kubernetes.io/projected/c0a31e7f-de02-4d48-8581-99a2fbf6a34f-kube-api-access-24zq9\") pod \"image-registry-66df7c8f76-mjkph\" (UID: \"c0a31e7f-de02-4d48-8581-99a2fbf6a34f\") " pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.104884 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-x8mq8" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.120808 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.238541 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grxg2\" (UniqueName: \"kubernetes.io/projected/949d71ae-f754-4b5c-8c0b-fec8d374f27e-kube-api-access-grxg2\") pod \"949d71ae-f754-4b5c-8c0b-fec8d374f27e\" (UID: \"949d71ae-f754-4b5c-8c0b-fec8d374f27e\") " Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.242941 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d71ae-f754-4b5c-8c0b-fec8d374f27e-kube-api-access-grxg2" (OuterVolumeSpecName: "kube-api-access-grxg2") pod "949d71ae-f754-4b5c-8c0b-fec8d374f27e" (UID: "949d71ae-f754-4b5c-8c0b-fec8d374f27e"). InnerVolumeSpecName "kube-api-access-grxg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.340956 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grxg2\" (UniqueName: \"kubernetes.io/projected/949d71ae-f754-4b5c-8c0b-fec8d374f27e-kube-api-access-grxg2\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.352428 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mjkph"] Mar 18 10:20:04 crc kubenswrapper[4733]: W0318 10:20:04.356580 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a31e7f_de02_4d48_8581_99a2fbf6a34f.slice/crio-7445fed31344d0de91797ecb9edaa7bed1664ab428aba817759b56ba37e6521f WatchSource:0}: Error finding container 7445fed31344d0de91797ecb9edaa7bed1664ab428aba817759b56ba37e6521f: Status 404 returned error can't find the container with id 7445fed31344d0de91797ecb9edaa7bed1664ab428aba817759b56ba37e6521f Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.803025 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563820-x8mq8" event={"ID":"949d71ae-f754-4b5c-8c0b-fec8d374f27e","Type":"ContainerDied","Data":"651bd5a061b0713de78b4145e584b9b64603e782db5f257b9677273fe6364f7e"} Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.803639 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651bd5a061b0713de78b4145e584b9b64603e782db5f257b9677273fe6364f7e" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.803083 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563820-x8mq8" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.807007 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" event={"ID":"c0a31e7f-de02-4d48-8581-99a2fbf6a34f","Type":"ContainerStarted","Data":"5fac677e322c63bc66da29cf2df8dccfe7c681a6387d8dcd8638f32db2639ad1"} Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.807727 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.807874 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" event={"ID":"c0a31e7f-de02-4d48-8581-99a2fbf6a34f","Type":"ContainerStarted","Data":"7445fed31344d0de91797ecb9edaa7bed1664ab428aba817759b56ba37e6521f"} Mar 18 10:20:04 crc kubenswrapper[4733]: I0318 10:20:04.832740 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" podStartSLOduration=1.8327092569999999 podStartE2EDuration="1.832709257s" podCreationTimestamp="2026-03-18 10:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:20:04.824418169 +0000 UTC m=+444.316152524" watchObservedRunningTime="2026-03-18 10:20:04.832709257 +0000 UTC m=+444.324443582" Mar 18 10:20:13 crc kubenswrapper[4733]: I0318 10:20:13.571250 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:20:13 crc kubenswrapper[4733]: I0318 10:20:13.572309 4733 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:20:24 crc kubenswrapper[4733]: I0318 10:20:24.129486 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mjkph" Mar 18 10:20:24 crc kubenswrapper[4733]: I0318 10:20:24.246768 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nwhtg"] Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.518744 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rls2r"] Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.520050 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rls2r" podUID="92996997-080b-42c9-bc2c-19c2e68db896" containerName="registry-server" containerID="cri-o://7e9cf80fc09f50439f722c47d01b38f7d154cd5514d553f0573a5303858564f5" gracePeriod=30 Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.536548 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f92nl"] Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.537035 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f92nl" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" containerName="registry-server" containerID="cri-o://d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b" gracePeriod=30 Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.545239 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9h9xr"] Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 
10:20:27.545546 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" containerID="cri-o://e99c56e1939c6c49ea2bc0d06c119ca0495ae09507c35a951b28f4145d07b5a2" gracePeriod=30 Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.553227 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb86w"] Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.553566 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jb86w" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerName="registry-server" containerID="cri-o://36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014" gracePeriod=30 Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.567721 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z6qb2"] Mar 18 10:20:27 crc kubenswrapper[4733]: E0318 10:20:27.567977 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949d71ae-f754-4b5c-8c0b-fec8d374f27e" containerName="oc" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.567991 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="949d71ae-f754-4b5c-8c0b-fec8d374f27e" containerName="oc" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.568096 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="949d71ae-f754-4b5c-8c0b-fec8d374f27e" containerName="oc" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.568715 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.574065 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hrwxg"] Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.574272 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hrwxg" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="registry-server" containerID="cri-o://33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47" gracePeriod=30 Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.584253 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z6qb2"] Mar 18 10:20:27 crc kubenswrapper[4733]: E0318 10:20:27.641432 4733 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47 is running failed: container process not found" containerID="33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 10:20:27 crc kubenswrapper[4733]: E0318 10:20:27.642686 4733 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47 is running failed: container process not found" containerID="33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 10:20:27 crc kubenswrapper[4733]: E0318 10:20:27.643280 4733 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47 is 
running failed: container process not found" containerID="33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 10:20:27 crc kubenswrapper[4733]: E0318 10:20:27.643363 4733 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-hrwxg" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="registry-server" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.653317 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q486k\" (UniqueName: \"kubernetes.io/projected/8ae3847e-6357-46a1-9578-88deb6e1531b-kube-api-access-q486k\") pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.653366 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae3847e-6357-46a1-9578-88deb6e1531b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.653431 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ae3847e-6357-46a1-9578-88deb6e1531b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 
10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.757036 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ae3847e-6357-46a1-9578-88deb6e1531b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.757143 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q486k\" (UniqueName: \"kubernetes.io/projected/8ae3847e-6357-46a1-9578-88deb6e1531b-kube-api-access-q486k\") pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.757194 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae3847e-6357-46a1-9578-88deb6e1531b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.759190 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae3847e-6357-46a1-9578-88deb6e1531b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.766141 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ae3847e-6357-46a1-9578-88deb6e1531b-marketplace-operator-metrics\") 
pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.776328 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q486k\" (UniqueName: \"kubernetes.io/projected/8ae3847e-6357-46a1-9578-88deb6e1531b-kube-api-access-q486k\") pod \"marketplace-operator-79b997595-z6qb2\" (UID: \"8ae3847e-6357-46a1-9578-88deb6e1531b\") " pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.916411 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.975395 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb86w" Mar 18 10:20:27 crc kubenswrapper[4733]: I0318 10:20:27.987383 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.012449 4733 generic.go:334] "Generic (PLEG): container finished" podID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerID="33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47" exitCode=0 Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.013245 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrwxg" event={"ID":"fb7ed879-1474-4200-88d4-70e425e2bcb1","Type":"ContainerDied","Data":"33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47"} Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.029760 4733 generic.go:334] "Generic (PLEG): container finished" podID="92996997-080b-42c9-bc2c-19c2e68db896" containerID="7e9cf80fc09f50439f722c47d01b38f7d154cd5514d553f0573a5303858564f5" exitCode=0 Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.029827 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rls2r" event={"ID":"92996997-080b-42c9-bc2c-19c2e68db896","Type":"ContainerDied","Data":"7e9cf80fc09f50439f722c47d01b38f7d154cd5514d553f0573a5303858564f5"} Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.031137 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.032068 4733 generic.go:334] "Generic (PLEG): container finished" podID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerID="36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014" exitCode=0 Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.032107 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb86w" event={"ID":"0fd306cb-05db-40e1-a1ec-9f811ce7fec0","Type":"ContainerDied","Data":"36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014"} Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.032126 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb86w" event={"ID":"0fd306cb-05db-40e1-a1ec-9f811ce7fec0","Type":"ContainerDied","Data":"152bb2d9d2d5d61c127ef6162804e32f4f4e993fb3a1aa90d7238cb79aedf035"} Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.032146 4733 scope.go:117] "RemoveContainer" containerID="36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.032319 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb86w" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.039783 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.044183 4733 generic.go:334] "Generic (PLEG): container finished" podID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerID="e99c56e1939c6c49ea2bc0d06c119ca0495ae09507c35a951b28f4145d07b5a2" exitCode=0 Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.044290 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" event={"ID":"5192f67b-f2ab-45eb-9b1a-64bdff02437a","Type":"ContainerDied","Data":"e99c56e1939c6c49ea2bc0d06c119ca0495ae09507c35a951b28f4145d07b5a2"} Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.045563 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.051170 4733 generic.go:334] "Generic (PLEG): container finished" podID="527056ad-4daf-4dd5-9e31-887d55be0336" containerID="d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b" exitCode=0 Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.051211 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f92nl" event={"ID":"527056ad-4daf-4dd5-9e31-887d55be0336","Type":"ContainerDied","Data":"d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b"} Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.051230 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f92nl" event={"ID":"527056ad-4daf-4dd5-9e31-887d55be0336","Type":"ContainerDied","Data":"5d9e5dab0932c3cd3cd8b8f12fa8d0d49db59eddcefaa706bd16f11d86be1eac"} Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.051326 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f92nl" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.063351 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j4k9\" (UniqueName: \"kubernetes.io/projected/527056ad-4daf-4dd5-9e31-887d55be0336-kube-api-access-9j4k9\") pod \"527056ad-4daf-4dd5-9e31-887d55be0336\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.063404 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-catalog-content\") pod \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.063483 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-utilities\") pod \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.063507 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-catalog-content\") pod \"527056ad-4daf-4dd5-9e31-887d55be0336\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.063581 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5frtf\" (UniqueName: \"kubernetes.io/projected/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-kube-api-access-5frtf\") pod \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\" (UID: \"0fd306cb-05db-40e1-a1ec-9f811ce7fec0\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.063609 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-utilities\") pod \"527056ad-4daf-4dd5-9e31-887d55be0336\" (UID: \"527056ad-4daf-4dd5-9e31-887d55be0336\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.066472 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-utilities" (OuterVolumeSpecName: "utilities") pod "0fd306cb-05db-40e1-a1ec-9f811ce7fec0" (UID: "0fd306cb-05db-40e1-a1ec-9f811ce7fec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.068176 4733 scope.go:117] "RemoveContainer" containerID="4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.070804 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-kube-api-access-5frtf" (OuterVolumeSpecName: "kube-api-access-5frtf") pod "0fd306cb-05db-40e1-a1ec-9f811ce7fec0" (UID: "0fd306cb-05db-40e1-a1ec-9f811ce7fec0"). InnerVolumeSpecName "kube-api-access-5frtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.073125 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527056ad-4daf-4dd5-9e31-887d55be0336-kube-api-access-9j4k9" (OuterVolumeSpecName: "kube-api-access-9j4k9") pod "527056ad-4daf-4dd5-9e31-887d55be0336" (UID: "527056ad-4daf-4dd5-9e31-887d55be0336"). InnerVolumeSpecName "kube-api-access-9j4k9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.078172 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-utilities" (OuterVolumeSpecName: "utilities") pod "527056ad-4daf-4dd5-9e31-887d55be0336" (UID: "527056ad-4daf-4dd5-9e31-887d55be0336"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.120435 4733 scope.go:117] "RemoveContainer" containerID="deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.120629 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fd306cb-05db-40e1-a1ec-9f811ce7fec0" (UID: "0fd306cb-05db-40e1-a1ec-9f811ce7fec0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.139875 4733 scope.go:117] "RemoveContainer" containerID="36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014" Mar 18 10:20:28 crc kubenswrapper[4733]: E0318 10:20:28.140537 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014\": container with ID starting with 36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014 not found: ID does not exist" containerID="36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.140586 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014"} err="failed to get container status \"36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014\": rpc error: code = NotFound desc = could not find container \"36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014\": container with ID starting with 36d15214eccc522b73ee0fe4b5f5b4531b1d0593c4e73af5bdcac8f8e55d7014 not found: ID does not exist" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.140623 4733 scope.go:117] "RemoveContainer" containerID="4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9" Mar 18 10:20:28 crc kubenswrapper[4733]: E0318 10:20:28.141109 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9\": container with ID starting with 4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9 not found: ID does not exist" containerID="4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.141141 
4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9"} err="failed to get container status \"4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9\": rpc error: code = NotFound desc = could not find container \"4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9\": container with ID starting with 4da80ec2ba0c104ba8616114aa62d195906b3ceb35fe815aeee6c6a50ba00bd9 not found: ID does not exist" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.141170 4733 scope.go:117] "RemoveContainer" containerID="deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e" Mar 18 10:20:28 crc kubenswrapper[4733]: E0318 10:20:28.141529 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e\": container with ID starting with deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e not found: ID does not exist" containerID="deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.141587 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e"} err="failed to get container status \"deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e\": rpc error: code = NotFound desc = could not find container \"deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e\": container with ID starting with deb249a09e24f844e1f0eaad077e13c564da63c225d86fe92c3b3e169a3f2a0e not found: ID does not exist" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.141627 4733 scope.go:117] "RemoveContainer" containerID="e99c56e1939c6c49ea2bc0d06c119ca0495ae09507c35a951b28f4145d07b5a2" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 
10:20:28.153794 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "527056ad-4daf-4dd5-9e31-887d55be0336" (UID: "527056ad-4daf-4dd5-9e31-887d55be0336"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164534 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsw8d\" (UniqueName: \"kubernetes.io/projected/fb7ed879-1474-4200-88d4-70e425e2bcb1-kube-api-access-jsw8d\") pod \"fb7ed879-1474-4200-88d4-70e425e2bcb1\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164601 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mscv\" (UniqueName: \"kubernetes.io/projected/5192f67b-f2ab-45eb-9b1a-64bdff02437a-kube-api-access-8mscv\") pod \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164656 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-utilities\") pod \"92996997-080b-42c9-bc2c-19c2e68db896\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164703 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-utilities\") pod \"fb7ed879-1474-4200-88d4-70e425e2bcb1\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164726 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-trusted-ca\") pod \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164748 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-catalog-content\") pod \"92996997-080b-42c9-bc2c-19c2e68db896\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164777 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7hv7\" (UniqueName: \"kubernetes.io/projected/92996997-080b-42c9-bc2c-19c2e68db896-kube-api-access-w7hv7\") pod \"92996997-080b-42c9-bc2c-19c2e68db896\" (UID: \"92996997-080b-42c9-bc2c-19c2e68db896\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164804 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-catalog-content\") pod \"fb7ed879-1474-4200-88d4-70e425e2bcb1\" (UID: \"fb7ed879-1474-4200-88d4-70e425e2bcb1\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.164843 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-operator-metrics\") pod \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\" (UID: \"5192f67b-f2ab-45eb-9b1a-64bdff02437a\") " Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.165111 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j4k9\" (UniqueName: \"kubernetes.io/projected/527056ad-4daf-4dd5-9e31-887d55be0336-kube-api-access-9j4k9\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 
10:20:28.165123 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.165133 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.165143 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.165152 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5frtf\" (UniqueName: \"kubernetes.io/projected/0fd306cb-05db-40e1-a1ec-9f811ce7fec0-kube-api-access-5frtf\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.165162 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527056ad-4daf-4dd5-9e31-887d55be0336-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.169990 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5192f67b-f2ab-45eb-9b1a-64bdff02437a" (UID: "5192f67b-f2ab-45eb-9b1a-64bdff02437a"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.170166 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-utilities" (OuterVolumeSpecName: "utilities") pod "fb7ed879-1474-4200-88d4-70e425e2bcb1" (UID: "fb7ed879-1474-4200-88d4-70e425e2bcb1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.171571 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5192f67b-f2ab-45eb-9b1a-64bdff02437a" (UID: "5192f67b-f2ab-45eb-9b1a-64bdff02437a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.172742 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-utilities" (OuterVolumeSpecName: "utilities") pod "92996997-080b-42c9-bc2c-19c2e68db896" (UID: "92996997-080b-42c9-bc2c-19c2e68db896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.172889 4733 scope.go:117] "RemoveContainer" containerID="3d92f9fbfa1c8b8490e331060d587d908cf420777497bf90bb4815f3f49e79dd" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.173304 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7ed879-1474-4200-88d4-70e425e2bcb1-kube-api-access-jsw8d" (OuterVolumeSpecName: "kube-api-access-jsw8d") pod "fb7ed879-1474-4200-88d4-70e425e2bcb1" (UID: "fb7ed879-1474-4200-88d4-70e425e2bcb1"). InnerVolumeSpecName "kube-api-access-jsw8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.175265 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5192f67b-f2ab-45eb-9b1a-64bdff02437a-kube-api-access-8mscv" (OuterVolumeSpecName: "kube-api-access-8mscv") pod "5192f67b-f2ab-45eb-9b1a-64bdff02437a" (UID: "5192f67b-f2ab-45eb-9b1a-64bdff02437a"). InnerVolumeSpecName "kube-api-access-8mscv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.184710 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92996997-080b-42c9-bc2c-19c2e68db896-kube-api-access-w7hv7" (OuterVolumeSpecName: "kube-api-access-w7hv7") pod "92996997-080b-42c9-bc2c-19c2e68db896" (UID: "92996997-080b-42c9-bc2c-19c2e68db896"). InnerVolumeSpecName "kube-api-access-w7hv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.193517 4733 scope.go:117] "RemoveContainer" containerID="d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.225599 4733 scope.go:117] "RemoveContainer" containerID="29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.241813 4733 scope.go:117] "RemoveContainer" containerID="83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.242975 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92996997-080b-42c9-bc2c-19c2e68db896" (UID: "92996997-080b-42c9-bc2c-19c2e68db896"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.257577 4733 scope.go:117] "RemoveContainer" containerID="d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b" Mar 18 10:20:28 crc kubenswrapper[4733]: E0318 10:20:28.257971 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b\": container with ID starting with d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b not found: ID does not exist" containerID="d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.258043 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b"} err="failed to get container status \"d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b\": rpc error: code = NotFound desc = could not find container \"d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b\": container with ID starting with d88a014dfa4a61b3bdf527747022f6d4b6201eb43fb9d2c08a1918862483878b not found: ID does not exist" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.258079 4733 scope.go:117] "RemoveContainer" containerID="29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4" Mar 18 10:20:28 crc kubenswrapper[4733]: E0318 10:20:28.258449 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4\": container with ID starting with 29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4 not found: ID does not exist" containerID="29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.258493 
4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4"} err="failed to get container status \"29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4\": rpc error: code = NotFound desc = could not find container \"29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4\": container with ID starting with 29549f7b8e67a919f47e1ac510a621c5aca25e45afa1c1c52c5acdec0d566db4 not found: ID does not exist" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.258525 4733 scope.go:117] "RemoveContainer" containerID="83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274" Mar 18 10:20:28 crc kubenswrapper[4733]: E0318 10:20:28.258835 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274\": container with ID starting with 83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274 not found: ID does not exist" containerID="83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.258862 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274"} err="failed to get container status \"83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274\": rpc error: code = NotFound desc = could not find container \"83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274\": container with ID starting with 83fe7a9d478dddba70a4985b321c90b2fd18ace1a534bec99183ab383ee3f274 not found: ID does not exist" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.267022 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsw8d\" (UniqueName: 
\"kubernetes.io/projected/fb7ed879-1474-4200-88d4-70e425e2bcb1-kube-api-access-jsw8d\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.267055 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mscv\" (UniqueName: \"kubernetes.io/projected/5192f67b-f2ab-45eb-9b1a-64bdff02437a-kube-api-access-8mscv\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.267065 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.267075 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.267086 4733 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.267098 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92996997-080b-42c9-bc2c-19c2e68db896-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.267107 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7hv7\" (UniqueName: \"kubernetes.io/projected/92996997-080b-42c9-bc2c-19c2e68db896-kube-api-access-w7hv7\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.267118 4733 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/5192f67b-f2ab-45eb-9b1a-64bdff02437a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.311224 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb7ed879-1474-4200-88d4-70e425e2bcb1" (UID: "fb7ed879-1474-4200-88d4-70e425e2bcb1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.368500 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7ed879-1474-4200-88d4-70e425e2bcb1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.369001 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb86w"] Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.372070 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb86w"] Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.393073 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f92nl"] Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.396025 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f92nl"] Mar 18 10:20:28 crc kubenswrapper[4733]: I0318 10:20:28.425246 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z6qb2"] Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.059351 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" 
event={"ID":"8ae3847e-6357-46a1-9578-88deb6e1531b","Type":"ContainerStarted","Data":"c456aa617e5a799611a817c36dd58fdd0a8f734f6e36771f134de22166e221d6"} Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.059433 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" event={"ID":"8ae3847e-6357-46a1-9578-88deb6e1531b","Type":"ContainerStarted","Data":"a9d8c901de60212a4bb21405b7e26db54d301aa6c882e75bd3bdd11b27f7bb2d"} Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.060074 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.063716 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrwxg" event={"ID":"fb7ed879-1474-4200-88d4-70e425e2bcb1","Type":"ContainerDied","Data":"62aa2aa87c6f58e0a138486db1e0ff0949ce50a5eef4891759673935a2791e3b"} Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.063758 4733 scope.go:117] "RemoveContainer" containerID="33010d46494372b311f8b2a190a49601d96469c4c865b75dc62dd08ddc447a47" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.063818 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hrwxg" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.064240 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.068138 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rls2r" event={"ID":"92996997-080b-42c9-bc2c-19c2e68db896","Type":"ContainerDied","Data":"448a9e96bdf06f234c1da361f4be5cda2d36bf670a134ff4f206711028d80cac"} Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.068302 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rls2r" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.079062 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.079048 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9h9xr" event={"ID":"5192f67b-f2ab-45eb-9b1a-64bdff02437a","Type":"ContainerDied","Data":"7e583c6a058ccd4e267ac556fbc1ecc397a1e062881c05b38f716c2d4a35947b"} Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.088016 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" podStartSLOduration=2.08798978 podStartE2EDuration="2.08798978s" podCreationTimestamp="2026-03-18 10:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:20:29.08236508 +0000 UTC m=+468.574099405" watchObservedRunningTime="2026-03-18 10:20:29.08798978 +0000 UTC m=+468.579724105" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.091539 4733 
scope.go:117] "RemoveContainer" containerID="340cfa7d2b8654b1dea28355651bf6f54381a8104d827e6d38142ffcaf93e8ae" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.135250 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hrwxg"] Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.146775 4733 scope.go:117] "RemoveContainer" containerID="aa0522bdc088c10a6b3c5dba1e3ad5057a62e8ded941287c75083cef63e55041" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.158152 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hrwxg"] Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.172241 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rls2r"] Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.176829 4733 scope.go:117] "RemoveContainer" containerID="7e9cf80fc09f50439f722c47d01b38f7d154cd5514d553f0573a5303858564f5" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.196468 4733 scope.go:117] "RemoveContainer" containerID="9295312051c24cc07301903e63a22c698207253e2dd4d338c0be4c6fd4de6dec" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.226981 4733 scope.go:117] "RemoveContainer" containerID="a9bf744158dbc316b120322e1385bd5232386e738d2db0f1d91d2ac7d8a7ad1a" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.235283 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" path="/var/lib/kubelet/pods/0fd306cb-05db-40e1-a1ec-9f811ce7fec0/volumes" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.236306 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" path="/var/lib/kubelet/pods/527056ad-4daf-4dd5-9e31-887d55be0336/volumes" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.237005 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" path="/var/lib/kubelet/pods/fb7ed879-1474-4200-88d4-70e425e2bcb1/volumes" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.238195 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rls2r"] Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.238331 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kcbhw"] Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.238607 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.238676 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.238743 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92996997-080b-42c9-bc2c-19c2e68db896" containerName="extract-utilities" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.238797 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="92996997-080b-42c9-bc2c-19c2e68db896" containerName="extract-utilities" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.238885 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" containerName="extract-content" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.238947 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" containerName="extract-content" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.239018 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.239072 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.239133 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92996997-080b-42c9-bc2c-19c2e68db896" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.239189 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="92996997-080b-42c9-bc2c-19c2e68db896" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.239368 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="extract-utilities" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.239442 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="extract-utilities" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.239533 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.239592 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.239648 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerName="extract-content" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.239702 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerName="extract-content" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.239765 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="extract-content" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.239824 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="extract-content" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.239894 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" containerName="extract-utilities" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.239958 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" containerName="extract-utilities" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.240023 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerName="extract-utilities" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.240087 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerName="extract-utilities" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.240168 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.240252 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.240368 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92996997-080b-42c9-bc2c-19c2e68db896" containerName="extract-content" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.244421 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="92996997-080b-42c9-bc2c-19c2e68db896" containerName="extract-content" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.244663 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.244738 4733 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="0fd306cb-05db-40e1-a1ec-9f811ce7fec0" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.244832 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="92996997-080b-42c9-bc2c-19c2e68db896" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.244900 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="527056ad-4daf-4dd5-9e31-887d55be0336" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.244967 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7ed879-1474-4200-88d4-70e425e2bcb1" containerName="registry-server" Mar 18 10:20:29 crc kubenswrapper[4733]: E0318 10:20:29.245143 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.245231 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.245399 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" containerName="marketplace-operator" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.246141 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9h9xr"] Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.246259 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9h9xr"] Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.246464 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcbhw"] Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.246520 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.248854 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.384348 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20640f37-bf35-4f24-abbb-b31cd00e5c9c-utilities\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.384438 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20640f37-bf35-4f24-abbb-b31cd00e5c9c-catalog-content\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.384476 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jzxm\" (UniqueName: \"kubernetes.io/projected/20640f37-bf35-4f24-abbb-b31cd00e5c9c-kube-api-access-5jzxm\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.487542 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20640f37-bf35-4f24-abbb-b31cd00e5c9c-catalog-content\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.487650 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5jzxm\" (UniqueName: \"kubernetes.io/projected/20640f37-bf35-4f24-abbb-b31cd00e5c9c-kube-api-access-5jzxm\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.487862 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20640f37-bf35-4f24-abbb-b31cd00e5c9c-utilities\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.488539 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20640f37-bf35-4f24-abbb-b31cd00e5c9c-catalog-content\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.488747 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20640f37-bf35-4f24-abbb-b31cd00e5c9c-utilities\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.524751 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jzxm\" (UniqueName: \"kubernetes.io/projected/20640f37-bf35-4f24-abbb-b31cd00e5c9c-kube-api-access-5jzxm\") pod \"redhat-marketplace-kcbhw\" (UID: \"20640f37-bf35-4f24-abbb-b31cd00e5c9c\") " pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.571141 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:29 crc kubenswrapper[4733]: I0318 10:20:29.832136 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcbhw"] Mar 18 10:20:29 crc kubenswrapper[4733]: W0318 10:20:29.845358 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20640f37_bf35_4f24_abbb_b31cd00e5c9c.slice/crio-675eafbf68a5f68c2b8e076e71fb67c1665b94db59299648dbebe38a01d77c6e WatchSource:0}: Error finding container 675eafbf68a5f68c2b8e076e71fb67c1665b94db59299648dbebe38a01d77c6e: Status 404 returned error can't find the container with id 675eafbf68a5f68c2b8e076e71fb67c1665b94db59299648dbebe38a01d77c6e Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.096236 4733 generic.go:334] "Generic (PLEG): container finished" podID="20640f37-bf35-4f24-abbb-b31cd00e5c9c" containerID="a6279a4b196a2b80b7ac0476edb850e8a5114b94481124dfdd5dd9a1afe30a1a" exitCode=0 Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.096349 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcbhw" event={"ID":"20640f37-bf35-4f24-abbb-b31cd00e5c9c","Type":"ContainerDied","Data":"a6279a4b196a2b80b7ac0476edb850e8a5114b94481124dfdd5dd9a1afe30a1a"} Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.098585 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcbhw" event={"ID":"20640f37-bf35-4f24-abbb-b31cd00e5c9c","Type":"ContainerStarted","Data":"675eafbf68a5f68c2b8e076e71fb67c1665b94db59299648dbebe38a01d77c6e"} Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.139525 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rs2b6"] Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.140926 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.144959 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.155128 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rs2b6"] Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.323028 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3018dd18-ee9f-44a1-ab22-a6bddde19b31-utilities\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.323074 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3018dd18-ee9f-44a1-ab22-a6bddde19b31-catalog-content\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.323358 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrr52\" (UniqueName: \"kubernetes.io/projected/3018dd18-ee9f-44a1-ab22-a6bddde19b31-kube-api-access-jrr52\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.424648 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3018dd18-ee9f-44a1-ab22-a6bddde19b31-utilities\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " 
pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.424696 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3018dd18-ee9f-44a1-ab22-a6bddde19b31-catalog-content\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.424758 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrr52\" (UniqueName: \"kubernetes.io/projected/3018dd18-ee9f-44a1-ab22-a6bddde19b31-kube-api-access-jrr52\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.425741 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3018dd18-ee9f-44a1-ab22-a6bddde19b31-utilities\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.426053 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3018dd18-ee9f-44a1-ab22-a6bddde19b31-catalog-content\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.446118 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrr52\" (UniqueName: \"kubernetes.io/projected/3018dd18-ee9f-44a1-ab22-a6bddde19b31-kube-api-access-jrr52\") pod \"redhat-operators-rs2b6\" (UID: \"3018dd18-ee9f-44a1-ab22-a6bddde19b31\") " pod="openshift-marketplace/redhat-operators-rs2b6" Mar 
18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.467728 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:30 crc kubenswrapper[4733]: I0318 10:20:30.698978 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rs2b6"] Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.125554 4733 generic.go:334] "Generic (PLEG): container finished" podID="20640f37-bf35-4f24-abbb-b31cd00e5c9c" containerID="174d079b92cada4004abb1338a1270574f6ce4e8b3c98eedfa1143baf90d2d39" exitCode=0 Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.125626 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcbhw" event={"ID":"20640f37-bf35-4f24-abbb-b31cd00e5c9c","Type":"ContainerDied","Data":"174d079b92cada4004abb1338a1270574f6ce4e8b3c98eedfa1143baf90d2d39"} Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.133428 4733 generic.go:334] "Generic (PLEG): container finished" podID="3018dd18-ee9f-44a1-ab22-a6bddde19b31" containerID="56f3163d6c13192463a53571cfbc21806b8894bc2c6d8bd6947c8fd7104dd2c3" exitCode=0 Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.134564 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs2b6" event={"ID":"3018dd18-ee9f-44a1-ab22-a6bddde19b31","Type":"ContainerDied","Data":"56f3163d6c13192463a53571cfbc21806b8894bc2c6d8bd6947c8fd7104dd2c3"} Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.134649 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs2b6" event={"ID":"3018dd18-ee9f-44a1-ab22-a6bddde19b31","Type":"ContainerStarted","Data":"10566242ba9e1966bd518bca1f234dab4e3ed98028c6d262dba24228d30cb8d1"} Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.191937 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5192f67b-f2ab-45eb-9b1a-64bdff02437a" path="/var/lib/kubelet/pods/5192f67b-f2ab-45eb-9b1a-64bdff02437a/volumes" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.193001 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92996997-080b-42c9-bc2c-19c2e68db896" path="/var/lib/kubelet/pods/92996997-080b-42c9-bc2c-19c2e68db896/volumes" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.537447 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c9s27"] Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.539539 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.544615 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.551548 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9s27"] Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.695144 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f55919-82b3-4117-8734-cb9a26364d83-utilities\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.695289 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxw6z\" (UniqueName: \"kubernetes.io/projected/a3f55919-82b3-4117-8734-cb9a26364d83-kube-api-access-jxw6z\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.695333 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f55919-82b3-4117-8734-cb9a26364d83-catalog-content\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.797380 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f55919-82b3-4117-8734-cb9a26364d83-utilities\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.797454 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxw6z\" (UniqueName: \"kubernetes.io/projected/a3f55919-82b3-4117-8734-cb9a26364d83-kube-api-access-jxw6z\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.797490 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f55919-82b3-4117-8734-cb9a26364d83-catalog-content\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.798091 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f55919-82b3-4117-8734-cb9a26364d83-catalog-content\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.798090 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f55919-82b3-4117-8734-cb9a26364d83-utilities\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.820667 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxw6z\" (UniqueName: \"kubernetes.io/projected/a3f55919-82b3-4117-8734-cb9a26364d83-kube-api-access-jxw6z\") pod \"certified-operators-c9s27\" (UID: \"a3f55919-82b3-4117-8734-cb9a26364d83\") " pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:31 crc kubenswrapper[4733]: I0318 10:20:31.925387 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.145831 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs2b6" event={"ID":"3018dd18-ee9f-44a1-ab22-a6bddde19b31","Type":"ContainerStarted","Data":"bd982dd2a6e8441453380cd896f110daacf18a1d8350c32f90c9d5ee2597cdc9"} Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.155890 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcbhw" event={"ID":"20640f37-bf35-4f24-abbb-b31cd00e5c9c","Type":"ContainerStarted","Data":"59625b60cde32a2df996e3f69224cf4c700e120ecd187c8f2bddd3a209e7c5d7"} Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.212850 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kcbhw" podStartSLOduration=1.7134267140000001 podStartE2EDuration="3.212821665s" podCreationTimestamp="2026-03-18 10:20:29 +0000 UTC" firstStartedPulling="2026-03-18 10:20:30.098387848 +0000 UTC m=+469.590122183" lastFinishedPulling="2026-03-18 10:20:31.597782799 +0000 
UTC m=+471.089517134" observedRunningTime="2026-03-18 10:20:32.206863527 +0000 UTC m=+471.698597852" watchObservedRunningTime="2026-03-18 10:20:32.212821665 +0000 UTC m=+471.704555990" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.215028 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9s27"] Mar 18 10:20:32 crc kubenswrapper[4733]: W0318 10:20:32.224158 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3f55919_82b3_4117_8734_cb9a26364d83.slice/crio-efa271b227a89b0d5a1d77b150c9d7877c28dcc24ca0532ddf853caac8cbb6f9 WatchSource:0}: Error finding container efa271b227a89b0d5a1d77b150c9d7877c28dcc24ca0532ddf853caac8cbb6f9: Status 404 returned error can't find the container with id efa271b227a89b0d5a1d77b150c9d7877c28dcc24ca0532ddf853caac8cbb6f9 Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.534142 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-smkxx"] Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.535669 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.539251 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.545917 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smkxx"] Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.713729 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b618f79-3791-49a8-a6aa-307fb25af727-utilities\") pod \"community-operators-smkxx\" (UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.713831 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8lxn\" (UniqueName: \"kubernetes.io/projected/3b618f79-3791-49a8-a6aa-307fb25af727-kube-api-access-s8lxn\") pod \"community-operators-smkxx\" (UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.713866 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b618f79-3791-49a8-a6aa-307fb25af727-catalog-content\") pod \"community-operators-smkxx\" (UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.816773 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8lxn\" (UniqueName: \"kubernetes.io/projected/3b618f79-3791-49a8-a6aa-307fb25af727-kube-api-access-s8lxn\") pod \"community-operators-smkxx\" 
(UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.816843 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b618f79-3791-49a8-a6aa-307fb25af727-catalog-content\") pod \"community-operators-smkxx\" (UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.816895 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b618f79-3791-49a8-a6aa-307fb25af727-utilities\") pod \"community-operators-smkxx\" (UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.817433 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b618f79-3791-49a8-a6aa-307fb25af727-utilities\") pod \"community-operators-smkxx\" (UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.817626 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b618f79-3791-49a8-a6aa-307fb25af727-catalog-content\") pod \"community-operators-smkxx\" (UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:32 crc kubenswrapper[4733]: I0318 10:20:32.850570 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8lxn\" (UniqueName: \"kubernetes.io/projected/3b618f79-3791-49a8-a6aa-307fb25af727-kube-api-access-s8lxn\") pod \"community-operators-smkxx\" (UID: \"3b618f79-3791-49a8-a6aa-307fb25af727\") " 
pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:33 crc kubenswrapper[4733]: I0318 10:20:33.150503 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:33 crc kubenswrapper[4733]: I0318 10:20:33.181528 4733 generic.go:334] "Generic (PLEG): container finished" podID="a3f55919-82b3-4117-8734-cb9a26364d83" containerID="e606dfee6a97efc0a2e0b4fe618cac06e1e645ae870d8e8c6b065cc31aafe0d4" exitCode=0 Mar 18 10:20:33 crc kubenswrapper[4733]: I0318 10:20:33.187776 4733 generic.go:334] "Generic (PLEG): container finished" podID="3018dd18-ee9f-44a1-ab22-a6bddde19b31" containerID="bd982dd2a6e8441453380cd896f110daacf18a1d8350c32f90c9d5ee2597cdc9" exitCode=0 Mar 18 10:20:33 crc kubenswrapper[4733]: I0318 10:20:33.208429 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9s27" event={"ID":"a3f55919-82b3-4117-8734-cb9a26364d83","Type":"ContainerDied","Data":"e606dfee6a97efc0a2e0b4fe618cac06e1e645ae870d8e8c6b065cc31aafe0d4"} Mar 18 10:20:33 crc kubenswrapper[4733]: I0318 10:20:33.208923 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9s27" event={"ID":"a3f55919-82b3-4117-8734-cb9a26364d83","Type":"ContainerStarted","Data":"efa271b227a89b0d5a1d77b150c9d7877c28dcc24ca0532ddf853caac8cbb6f9"} Mar 18 10:20:33 crc kubenswrapper[4733]: I0318 10:20:33.208941 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs2b6" event={"ID":"3018dd18-ee9f-44a1-ab22-a6bddde19b31","Type":"ContainerDied","Data":"bd982dd2a6e8441453380cd896f110daacf18a1d8350c32f90c9d5ee2597cdc9"} Mar 18 10:20:33 crc kubenswrapper[4733]: I0318 10:20:33.399421 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smkxx"] Mar 18 10:20:33 crc kubenswrapper[4733]: W0318 10:20:33.423642 4733 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b618f79_3791_49a8_a6aa_307fb25af727.slice/crio-a1a400c266e61794ede357d7a38da9c390230097dfc9e30144046a9753588466 WatchSource:0}: Error finding container a1a400c266e61794ede357d7a38da9c390230097dfc9e30144046a9753588466: Status 404 returned error can't find the container with id a1a400c266e61794ede357d7a38da9c390230097dfc9e30144046a9753588466 Mar 18 10:20:34 crc kubenswrapper[4733]: I0318 10:20:34.209268 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs2b6" event={"ID":"3018dd18-ee9f-44a1-ab22-a6bddde19b31","Type":"ContainerStarted","Data":"276b2767a0e52da4afffdbb245552b0f62633b7d2d1bbed274dbc46ec34e8057"} Mar 18 10:20:34 crc kubenswrapper[4733]: I0318 10:20:34.211068 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9s27" event={"ID":"a3f55919-82b3-4117-8734-cb9a26364d83","Type":"ContainerStarted","Data":"4165647e230870e79d6ec6d3439c6520c6ee58ecbce291ff2fced2a81650c005"} Mar 18 10:20:34 crc kubenswrapper[4733]: I0318 10:20:34.220514 4733 generic.go:334] "Generic (PLEG): container finished" podID="3b618f79-3791-49a8-a6aa-307fb25af727" containerID="a6d690c2b27ccb116b3ec078eacfdb29e10df8cced9c0cc50fb4f8814235e950" exitCode=0 Mar 18 10:20:34 crc kubenswrapper[4733]: I0318 10:20:34.220577 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smkxx" event={"ID":"3b618f79-3791-49a8-a6aa-307fb25af727","Type":"ContainerDied","Data":"a6d690c2b27ccb116b3ec078eacfdb29e10df8cced9c0cc50fb4f8814235e950"} Mar 18 10:20:34 crc kubenswrapper[4733]: I0318 10:20:34.220613 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smkxx" event={"ID":"3b618f79-3791-49a8-a6aa-307fb25af727","Type":"ContainerStarted","Data":"a1a400c266e61794ede357d7a38da9c390230097dfc9e30144046a9753588466"} Mar 18 10:20:34 
crc kubenswrapper[4733]: I0318 10:20:34.231953 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rs2b6" podStartSLOduration=1.748036549 podStartE2EDuration="4.231938735s" podCreationTimestamp="2026-03-18 10:20:30 +0000 UTC" firstStartedPulling="2026-03-18 10:20:31.13538435 +0000 UTC m=+470.627118715" lastFinishedPulling="2026-03-18 10:20:33.619286576 +0000 UTC m=+473.111020901" observedRunningTime="2026-03-18 10:20:34.230929207 +0000 UTC m=+473.722663532" watchObservedRunningTime="2026-03-18 10:20:34.231938735 +0000 UTC m=+473.723673080" Mar 18 10:20:35 crc kubenswrapper[4733]: I0318 10:20:35.229419 4733 generic.go:334] "Generic (PLEG): container finished" podID="a3f55919-82b3-4117-8734-cb9a26364d83" containerID="4165647e230870e79d6ec6d3439c6520c6ee58ecbce291ff2fced2a81650c005" exitCode=0 Mar 18 10:20:35 crc kubenswrapper[4733]: I0318 10:20:35.229498 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9s27" event={"ID":"a3f55919-82b3-4117-8734-cb9a26364d83","Type":"ContainerDied","Data":"4165647e230870e79d6ec6d3439c6520c6ee58ecbce291ff2fced2a81650c005"} Mar 18 10:20:36 crc kubenswrapper[4733]: I0318 10:20:36.240642 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9s27" event={"ID":"a3f55919-82b3-4117-8734-cb9a26364d83","Type":"ContainerStarted","Data":"233210e901efd5f8a61568c3871e0630b9a0f5d9872050ea82660e858137b2ee"} Mar 18 10:20:36 crc kubenswrapper[4733]: I0318 10:20:36.243394 4733 generic.go:334] "Generic (PLEG): container finished" podID="3b618f79-3791-49a8-a6aa-307fb25af727" containerID="4242ac68926bca06f761881b3ad94794f94a8733f43e8e77ca87a194635b909f" exitCode=0 Mar 18 10:20:36 crc kubenswrapper[4733]: I0318 10:20:36.243451 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smkxx" 
event={"ID":"3b618f79-3791-49a8-a6aa-307fb25af727","Type":"ContainerDied","Data":"4242ac68926bca06f761881b3ad94794f94a8733f43e8e77ca87a194635b909f"} Mar 18 10:20:36 crc kubenswrapper[4733]: I0318 10:20:36.271721 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c9s27" podStartSLOduration=2.844417524 podStartE2EDuration="5.27169448s" podCreationTimestamp="2026-03-18 10:20:31 +0000 UTC" firstStartedPulling="2026-03-18 10:20:33.183919622 +0000 UTC m=+472.675653957" lastFinishedPulling="2026-03-18 10:20:35.611196588 +0000 UTC m=+475.102930913" observedRunningTime="2026-03-18 10:20:36.268808919 +0000 UTC m=+475.760543244" watchObservedRunningTime="2026-03-18 10:20:36.27169448 +0000 UTC m=+475.763428805" Mar 18 10:20:37 crc kubenswrapper[4733]: I0318 10:20:37.253630 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smkxx" event={"ID":"3b618f79-3791-49a8-a6aa-307fb25af727","Type":"ContainerStarted","Data":"de1f82b2a6315c18123f4ab94fb2f046135a4829f1ab2eff1b0c278227e5cd61"} Mar 18 10:20:39 crc kubenswrapper[4733]: I0318 10:20:39.571389 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:39 crc kubenswrapper[4733]: I0318 10:20:39.571598 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:39 crc kubenswrapper[4733]: I0318 10:20:39.637313 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:39 crc kubenswrapper[4733]: I0318 10:20:39.657672 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-smkxx" podStartSLOduration=5.109467928 podStartE2EDuration="7.657647102s" podCreationTimestamp="2026-03-18 10:20:32 +0000 UTC" 
firstStartedPulling="2026-03-18 10:20:34.222296183 +0000 UTC m=+473.714030518" lastFinishedPulling="2026-03-18 10:20:36.770475367 +0000 UTC m=+476.262209692" observedRunningTime="2026-03-18 10:20:37.289654583 +0000 UTC m=+476.781388908" watchObservedRunningTime="2026-03-18 10:20:39.657647102 +0000 UTC m=+479.149381427" Mar 18 10:20:40 crc kubenswrapper[4733]: I0318 10:20:40.362227 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kcbhw" Mar 18 10:20:40 crc kubenswrapper[4733]: I0318 10:20:40.469070 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:40 crc kubenswrapper[4733]: I0318 10:20:40.469132 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:41 crc kubenswrapper[4733]: I0318 10:20:41.525485 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rs2b6" podUID="3018dd18-ee9f-44a1-ab22-a6bddde19b31" containerName="registry-server" probeResult="failure" output=< Mar 18 10:20:41 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Mar 18 10:20:41 crc kubenswrapper[4733]: > Mar 18 10:20:41 crc kubenswrapper[4733]: I0318 10:20:41.926440 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:41 crc kubenswrapper[4733]: I0318 10:20:41.926896 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:41 crc kubenswrapper[4733]: I0318 10:20:41.991967 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:42 crc kubenswrapper[4733]: I0318 10:20:42.363302 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-c9s27" Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.150996 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.151059 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.244522 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.364345 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-smkxx" Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.571487 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.571584 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.571652 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.572824 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2dcc5035fa17fe3e92cf26ce37e02cacce4ad31a0593e6e1184b98062f31f028"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:20:43 crc kubenswrapper[4733]: I0318 10:20:43.572932 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://2dcc5035fa17fe3e92cf26ce37e02cacce4ad31a0593e6e1184b98062f31f028" gracePeriod=600 Mar 18 10:20:44 crc kubenswrapper[4733]: I0318 10:20:44.337573 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="2dcc5035fa17fe3e92cf26ce37e02cacce4ad31a0593e6e1184b98062f31f028" exitCode=0 Mar 18 10:20:44 crc kubenswrapper[4733]: I0318 10:20:44.337736 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"2dcc5035fa17fe3e92cf26ce37e02cacce4ad31a0593e6e1184b98062f31f028"} Mar 18 10:20:44 crc kubenswrapper[4733]: I0318 10:20:44.338457 4733 scope.go:117] "RemoveContainer" containerID="615e7a90421535b4f8ff5e3b3a0ad9c958710094ffa4e3e4eb3eb41c79f80830" Mar 18 10:20:45 crc kubenswrapper[4733]: I0318 10:20:45.346897 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"bff727181393f1168072f98fbfc5cda5acfb0782a9ae8a688a8335ed7323a527"} Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.306338 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" 
podUID="7b5dc098-4a15-429b-8243-1ac75ce2e0c1" containerName="registry" containerID="cri-o://d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085" gracePeriod=30 Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.683115 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.838958 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qqrn\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-kube-api-access-2qqrn\") pod \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.839016 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-tls\") pod \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.839081 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-ca-trust-extracted\") pod \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.839118 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-installation-pull-secrets\") pod \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.839316 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.839399 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-trusted-ca\") pod \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.839437 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-bound-sa-token\") pod \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.839524 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-certificates\") pod \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\" (UID: \"7b5dc098-4a15-429b-8243-1ac75ce2e0c1\") " Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.840722 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7b5dc098-4a15-429b-8243-1ac75ce2e0c1" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.844034 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7b5dc098-4a15-429b-8243-1ac75ce2e0c1" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.851782 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7b5dc098-4a15-429b-8243-1ac75ce2e0c1" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.854876 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-kube-api-access-2qqrn" (OuterVolumeSpecName: "kube-api-access-2qqrn") pod "7b5dc098-4a15-429b-8243-1ac75ce2e0c1" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1"). InnerVolumeSpecName "kube-api-access-2qqrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.855397 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7b5dc098-4a15-429b-8243-1ac75ce2e0c1" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.855652 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7b5dc098-4a15-429b-8243-1ac75ce2e0c1" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.859161 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7b5dc098-4a15-429b-8243-1ac75ce2e0c1" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.870886 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7b5dc098-4a15-429b-8243-1ac75ce2e0c1" (UID: "7b5dc098-4a15-429b-8243-1ac75ce2e0c1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.941789 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qqrn\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-kube-api-access-2qqrn\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.942238 4733 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.942345 4733 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.942420 4733 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.942502 4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.943021 4733 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:49 crc kubenswrapper[4733]: I0318 10:20:49.943116 4733 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b5dc098-4a15-429b-8243-1ac75ce2e0c1-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 10:20:50 crc 
kubenswrapper[4733]: I0318 10:20:50.405136 4733 generic.go:334] "Generic (PLEG): container finished" podID="7b5dc098-4a15-429b-8243-1ac75ce2e0c1" containerID="d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085" exitCode=0 Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.405219 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.405219 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" event={"ID":"7b5dc098-4a15-429b-8243-1ac75ce2e0c1","Type":"ContainerDied","Data":"d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085"} Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.405354 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nwhtg" event={"ID":"7b5dc098-4a15-429b-8243-1ac75ce2e0c1","Type":"ContainerDied","Data":"bf9beab436bdff3f99c6c06c629fb5de1f2bcd079250aacd7d55627140dc6e11"} Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.405380 4733 scope.go:117] "RemoveContainer" containerID="d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085" Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.441761 4733 scope.go:117] "RemoveContainer" containerID="d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085" Mar 18 10:20:50 crc kubenswrapper[4733]: E0318 10:20:50.442645 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085\": container with ID starting with d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085 not found: ID does not exist" containerID="d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085" Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.442893 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085"} err="failed to get container status \"d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085\": rpc error: code = NotFound desc = could not find container \"d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085\": container with ID starting with d196ee7bec70e95ed9ff3308e0424855deaa072d0c4faba514ae98e1dcaec085 not found: ID does not exist" Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.463749 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nwhtg"] Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.470676 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nwhtg"] Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.538470 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:50 crc kubenswrapper[4733]: I0318 10:20:50.599578 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rs2b6" Mar 18 10:20:51 crc kubenswrapper[4733]: I0318 10:20:51.190226 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5dc098-4a15-429b-8243-1ac75ce2e0c1" path="/var/lib/kubelet/pods/7b5dc098-4a15-429b-8243-1ac75ce2e0c1/volumes" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.152981 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563822-4fvb8"] Mar 18 10:22:00 crc kubenswrapper[4733]: E0318 10:22:00.154668 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5dc098-4a15-429b-8243-1ac75ce2e0c1" containerName="registry" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.154697 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7b5dc098-4a15-429b-8243-1ac75ce2e0c1" containerName="registry" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.155609 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5dc098-4a15-429b-8243-1ac75ce2e0c1" containerName="registry" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.156301 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-4fvb8" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.159987 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.160380 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.160693 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.165797 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-4fvb8"] Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.295529 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whsqs\" (UniqueName: \"kubernetes.io/projected/3f93d05a-41f2-4422-88aa-9dfddb13191f-kube-api-access-whsqs\") pod \"auto-csr-approver-29563822-4fvb8\" (UID: \"3f93d05a-41f2-4422-88aa-9dfddb13191f\") " pod="openshift-infra/auto-csr-approver-29563822-4fvb8" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.397877 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whsqs\" (UniqueName: \"kubernetes.io/projected/3f93d05a-41f2-4422-88aa-9dfddb13191f-kube-api-access-whsqs\") pod \"auto-csr-approver-29563822-4fvb8\" (UID: \"3f93d05a-41f2-4422-88aa-9dfddb13191f\") " 
pod="openshift-infra/auto-csr-approver-29563822-4fvb8" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.428803 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whsqs\" (UniqueName: \"kubernetes.io/projected/3f93d05a-41f2-4422-88aa-9dfddb13191f-kube-api-access-whsqs\") pod \"auto-csr-approver-29563822-4fvb8\" (UID: \"3f93d05a-41f2-4422-88aa-9dfddb13191f\") " pod="openshift-infra/auto-csr-approver-29563822-4fvb8" Mar 18 10:22:00 crc kubenswrapper[4733]: I0318 10:22:00.493427 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-4fvb8" Mar 18 10:22:01 crc kubenswrapper[4733]: I0318 10:22:01.393176 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-4fvb8"] Mar 18 10:22:01 crc kubenswrapper[4733]: W0318 10:22:01.408684 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f93d05a_41f2_4422_88aa_9dfddb13191f.slice/crio-31735bf21c4d4370dc615feda0ad1646f4d6cdd954f38a88ca325149075006fa WatchSource:0}: Error finding container 31735bf21c4d4370dc615feda0ad1646f4d6cdd954f38a88ca325149075006fa: Status 404 returned error can't find the container with id 31735bf21c4d4370dc615feda0ad1646f4d6cdd954f38a88ca325149075006fa Mar 18 10:22:01 crc kubenswrapper[4733]: I0318 10:22:01.413558 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:22:01 crc kubenswrapper[4733]: I0318 10:22:01.996750 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-4fvb8" event={"ID":"3f93d05a-41f2-4422-88aa-9dfddb13191f","Type":"ContainerStarted","Data":"31735bf21c4d4370dc615feda0ad1646f4d6cdd954f38a88ca325149075006fa"} Mar 18 10:22:03 crc kubenswrapper[4733]: I0318 10:22:03.007591 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="3f93d05a-41f2-4422-88aa-9dfddb13191f" containerID="36c7a80bc1a34092c9183dbd958b5c05ea904377be8cffacb7112a1b4663e6a6" exitCode=0
Mar 18 10:22:03 crc kubenswrapper[4733]: I0318 10:22:03.007910 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-4fvb8" event={"ID":"3f93d05a-41f2-4422-88aa-9dfddb13191f","Type":"ContainerDied","Data":"36c7a80bc1a34092c9183dbd958b5c05ea904377be8cffacb7112a1b4663e6a6"}
Mar 18 10:22:04 crc kubenswrapper[4733]: I0318 10:22:04.338771 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-4fvb8"
Mar 18 10:22:04 crc kubenswrapper[4733]: I0318 10:22:04.458751 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whsqs\" (UniqueName: \"kubernetes.io/projected/3f93d05a-41f2-4422-88aa-9dfddb13191f-kube-api-access-whsqs\") pod \"3f93d05a-41f2-4422-88aa-9dfddb13191f\" (UID: \"3f93d05a-41f2-4422-88aa-9dfddb13191f\") "
Mar 18 10:22:04 crc kubenswrapper[4733]: I0318 10:22:04.466627 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f93d05a-41f2-4422-88aa-9dfddb13191f-kube-api-access-whsqs" (OuterVolumeSpecName: "kube-api-access-whsqs") pod "3f93d05a-41f2-4422-88aa-9dfddb13191f" (UID: "3f93d05a-41f2-4422-88aa-9dfddb13191f"). InnerVolumeSpecName "kube-api-access-whsqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:22:04 crc kubenswrapper[4733]: I0318 10:22:04.560887 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whsqs\" (UniqueName: \"kubernetes.io/projected/3f93d05a-41f2-4422-88aa-9dfddb13191f-kube-api-access-whsqs\") on node \"crc\" DevicePath \"\""
Mar 18 10:22:05 crc kubenswrapper[4733]: I0318 10:22:05.025181 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563822-4fvb8" event={"ID":"3f93d05a-41f2-4422-88aa-9dfddb13191f","Type":"ContainerDied","Data":"31735bf21c4d4370dc615feda0ad1646f4d6cdd954f38a88ca325149075006fa"}
Mar 18 10:22:05 crc kubenswrapper[4733]: I0318 10:22:05.025299 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563822-4fvb8"
Mar 18 10:22:05 crc kubenswrapper[4733]: I0318 10:22:05.025312 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31735bf21c4d4370dc615feda0ad1646f4d6cdd954f38a88ca325149075006fa"
Mar 18 10:22:05 crc kubenswrapper[4733]: I0318 10:22:05.421666 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-4582s"]
Mar 18 10:22:05 crc kubenswrapper[4733]: I0318 10:22:05.428119 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563816-4582s"]
Mar 18 10:22:07 crc kubenswrapper[4733]: I0318 10:22:07.186130 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a70c3c-d483-43f4-9f54-10978c7f8cc8" path="/var/lib/kubelet/pods/71a70c3c-d483-43f4-9f54-10978c7f8cc8/volumes"
Mar 18 10:23:13 crc kubenswrapper[4733]: I0318 10:23:13.571771 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:23:13 crc kubenswrapper[4733]: I0318 10:23:13.572824 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:23:43 crc kubenswrapper[4733]: I0318 10:23:43.571734 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:23:43 crc kubenswrapper[4733]: I0318 10:23:43.572489 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:23:54 crc kubenswrapper[4733]: I0318 10:23:54.538673 4733 scope.go:117] "RemoveContainer" containerID="869578488a5526adb52c0d5efeb676ea68e5c20e95b1cf2d208fa00dbd02baca"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.152033 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563824-l49xk"]
Mar 18 10:24:00 crc kubenswrapper[4733]: E0318 10:24:00.153450 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f93d05a-41f2-4422-88aa-9dfddb13191f" containerName="oc"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.153483 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f93d05a-41f2-4422-88aa-9dfddb13191f" containerName="oc"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.153723 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f93d05a-41f2-4422-88aa-9dfddb13191f" containerName="oc"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.154642 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-l49xk"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.158745 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.159285 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.159411 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.167403 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-l49xk"]
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.325655 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89gfz\" (UniqueName: \"kubernetes.io/projected/6077c15f-e285-4625-b336-a84327b1af2d-kube-api-access-89gfz\") pod \"auto-csr-approver-29563824-l49xk\" (UID: \"6077c15f-e285-4625-b336-a84327b1af2d\") " pod="openshift-infra/auto-csr-approver-29563824-l49xk"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.427508 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89gfz\" (UniqueName: \"kubernetes.io/projected/6077c15f-e285-4625-b336-a84327b1af2d-kube-api-access-89gfz\") pod \"auto-csr-approver-29563824-l49xk\" (UID: \"6077c15f-e285-4625-b336-a84327b1af2d\") " pod="openshift-infra/auto-csr-approver-29563824-l49xk"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.454923 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89gfz\" (UniqueName: \"kubernetes.io/projected/6077c15f-e285-4625-b336-a84327b1af2d-kube-api-access-89gfz\") pod \"auto-csr-approver-29563824-l49xk\" (UID: \"6077c15f-e285-4625-b336-a84327b1af2d\") " pod="openshift-infra/auto-csr-approver-29563824-l49xk"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.484336 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-l49xk"
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.711674 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-l49xk"]
Mar 18 10:24:00 crc kubenswrapper[4733]: I0318 10:24:00.869961 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-l49xk" event={"ID":"6077c15f-e285-4625-b336-a84327b1af2d","Type":"ContainerStarted","Data":"6f54837142a70aebab3aa9bcb8b9b38eb2ba43f3e59b7b9ba5aa264c07ff2706"}
Mar 18 10:24:01 crc kubenswrapper[4733]: I0318 10:24:01.880101 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-l49xk" event={"ID":"6077c15f-e285-4625-b336-a84327b1af2d","Type":"ContainerStarted","Data":"6f50555f9faf96f94c8c33f53803364eb9620cbe1dd5e27e68cba9056a299fa1"}
Mar 18 10:24:01 crc kubenswrapper[4733]: I0318 10:24:01.894116 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563824-l49xk" podStartSLOduration=1.116388676 podStartE2EDuration="1.89409588s" podCreationTimestamp="2026-03-18 10:24:00 +0000 UTC" firstStartedPulling="2026-03-18 10:24:00.713224348 +0000 UTC m=+680.204958683" lastFinishedPulling="2026-03-18 10:24:01.490931542 +0000 UTC m=+680.982665887" observedRunningTime="2026-03-18 10:24:01.891578329 +0000 UTC m=+681.383312654" watchObservedRunningTime="2026-03-18 10:24:01.89409588 +0000 UTC m=+681.385830205"
Mar 18 10:24:02 crc kubenswrapper[4733]: I0318 10:24:02.891435 4733 generic.go:334] "Generic (PLEG): container finished" podID="6077c15f-e285-4625-b336-a84327b1af2d" containerID="6f50555f9faf96f94c8c33f53803364eb9620cbe1dd5e27e68cba9056a299fa1" exitCode=0
Mar 18 10:24:02 crc kubenswrapper[4733]: I0318 10:24:02.891507 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-l49xk" event={"ID":"6077c15f-e285-4625-b336-a84327b1af2d","Type":"ContainerDied","Data":"6f50555f9faf96f94c8c33f53803364eb9620cbe1dd5e27e68cba9056a299fa1"}
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.152516 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-l49xk"
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.270829 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-44h4f"]
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.276980 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563818-44h4f"]
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.281493 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89gfz\" (UniqueName: \"kubernetes.io/projected/6077c15f-e285-4625-b336-a84327b1af2d-kube-api-access-89gfz\") pod \"6077c15f-e285-4625-b336-a84327b1af2d\" (UID: \"6077c15f-e285-4625-b336-a84327b1af2d\") "
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.290469 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6077c15f-e285-4625-b336-a84327b1af2d-kube-api-access-89gfz" (OuterVolumeSpecName: "kube-api-access-89gfz") pod "6077c15f-e285-4625-b336-a84327b1af2d" (UID: "6077c15f-e285-4625-b336-a84327b1af2d"). InnerVolumeSpecName "kube-api-access-89gfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.383837 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89gfz\" (UniqueName: \"kubernetes.io/projected/6077c15f-e285-4625-b336-a84327b1af2d-kube-api-access-89gfz\") on node \"crc\" DevicePath \"\""
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.908005 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563824-l49xk" event={"ID":"6077c15f-e285-4625-b336-a84327b1af2d","Type":"ContainerDied","Data":"6f54837142a70aebab3aa9bcb8b9b38eb2ba43f3e59b7b9ba5aa264c07ff2706"}
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.908068 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f54837142a70aebab3aa9bcb8b9b38eb2ba43f3e59b7b9ba5aa264c07ff2706"
Mar 18 10:24:04 crc kubenswrapper[4733]: I0318 10:24:04.908140 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563824-l49xk"
Mar 18 10:24:05 crc kubenswrapper[4733]: I0318 10:24:05.191150 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2" path="/var/lib/kubelet/pods/c7d7efa6-dd10-4ee1-a93b-13ae5f74ebe2/volumes"
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.571953 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.574365 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.574580 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp"
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.575538 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bff727181393f1168072f98fbfc5cda5acfb0782a9ae8a688a8335ed7323a527"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.575773 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://bff727181393f1168072f98fbfc5cda5acfb0782a9ae8a688a8335ed7323a527" gracePeriod=600
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.982947 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="bff727181393f1168072f98fbfc5cda5acfb0782a9ae8a688a8335ed7323a527" exitCode=0
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.983144 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"bff727181393f1168072f98fbfc5cda5acfb0782a9ae8a688a8335ed7323a527"}
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.983429 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"a11e956cdd33846b5919c35822db029436f82987d5e2c2bb6427c6d1dfd2098c"}
Mar 18 10:24:13 crc kubenswrapper[4733]: I0318 10:24:13.983461 4733 scope.go:117] "RemoveContainer" containerID="2dcc5035fa17fe3e92cf26ce37e02cacce4ad31a0593e6e1184b98062f31f028"
Mar 18 10:24:54 crc kubenswrapper[4733]: I0318 10:24:54.614360 4733 scope.go:117] "RemoveContainer" containerID="4287f6e7720d29c2928f6ce2bc4de5dd996378a83ad9d6dd58331a0b52048815"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.129242 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8ds68"]
Mar 18 10:25:35 crc kubenswrapper[4733]: E0318 10:25:35.130256 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6077c15f-e285-4625-b336-a84327b1af2d" containerName="oc"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.130275 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6077c15f-e285-4625-b336-a84327b1af2d" containerName="oc"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.130405 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="6077c15f-e285-4625-b336-a84327b1af2d" containerName="oc"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.130834 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8ds68"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.136428 4733 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-rjbj6"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.136506 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.143139 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-rd2dh"]
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.144886 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-rd2dh"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.152793 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.153635 4733 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-jd7pl"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.166849 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8ds68"]
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.173818 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2nr27"]
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.177973 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.183545 4733 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-mcqkw"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.196415 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-rd2dh"]
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.196488 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2nr27"]
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.228948 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq9p9\" (UniqueName: \"kubernetes.io/projected/534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8-kube-api-access-cq9p9\") pod \"cert-manager-webhook-687f57d79b-2nr27\" (UID: \"534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.331281 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq9p9\" (UniqueName: \"kubernetes.io/projected/534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8-kube-api-access-cq9p9\") pod \"cert-manager-webhook-687f57d79b-2nr27\" (UID: \"534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.331432 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9q6g\" (UniqueName: \"kubernetes.io/projected/ce77d29d-b82e-46be-a694-b6eea5da9379-kube-api-access-f9q6g\") pod \"cert-manager-858654f9db-rd2dh\" (UID: \"ce77d29d-b82e-46be-a694-b6eea5da9379\") " pod="cert-manager/cert-manager-858654f9db-rd2dh"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.331528 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8x85\" (UniqueName: \"kubernetes.io/projected/585c06be-86bd-48b7-954e-9aec01b08874-kube-api-access-s8x85\") pod \"cert-manager-cainjector-cf98fcc89-8ds68\" (UID: \"585c06be-86bd-48b7-954e-9aec01b08874\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8ds68"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.360545 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq9p9\" (UniqueName: \"kubernetes.io/projected/534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8-kube-api-access-cq9p9\") pod \"cert-manager-webhook-687f57d79b-2nr27\" (UID: \"534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.432856 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9q6g\" (UniqueName: \"kubernetes.io/projected/ce77d29d-b82e-46be-a694-b6eea5da9379-kube-api-access-f9q6g\") pod \"cert-manager-858654f9db-rd2dh\" (UID: \"ce77d29d-b82e-46be-a694-b6eea5da9379\") " pod="cert-manager/cert-manager-858654f9db-rd2dh"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.432947 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8x85\" (UniqueName: \"kubernetes.io/projected/585c06be-86bd-48b7-954e-9aec01b08874-kube-api-access-s8x85\") pod \"cert-manager-cainjector-cf98fcc89-8ds68\" (UID: \"585c06be-86bd-48b7-954e-9aec01b08874\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8ds68"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.450602 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9q6g\" (UniqueName: \"kubernetes.io/projected/ce77d29d-b82e-46be-a694-b6eea5da9379-kube-api-access-f9q6g\") pod \"cert-manager-858654f9db-rd2dh\" (UID: \"ce77d29d-b82e-46be-a694-b6eea5da9379\") " pod="cert-manager/cert-manager-858654f9db-rd2dh"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.463532 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8x85\" (UniqueName: \"kubernetes.io/projected/585c06be-86bd-48b7-954e-9aec01b08874-kube-api-access-s8x85\") pod \"cert-manager-cainjector-cf98fcc89-8ds68\" (UID: \"585c06be-86bd-48b7-954e-9aec01b08874\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8ds68"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.474299 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8ds68"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.486180 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-rd2dh"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.499749 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27"
Mar 18 10:25:35 crc kubenswrapper[4733]: I0318 10:25:35.761307 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-rd2dh"]
Mar 18 10:25:36 crc kubenswrapper[4733]: I0318 10:25:36.014047 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8ds68"]
Mar 18 10:25:36 crc kubenswrapper[4733]: W0318 10:25:36.019366 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod585c06be_86bd_48b7_954e_9aec01b08874.slice/crio-5c8591c08d39a2c1dcf39bc2b94cf19e43b4ae811739fc182daf63959295a448 WatchSource:0}: Error finding container 5c8591c08d39a2c1dcf39bc2b94cf19e43b4ae811739fc182daf63959295a448: Status 404 returned error can't find the container with id 5c8591c08d39a2c1dcf39bc2b94cf19e43b4ae811739fc182daf63959295a448
Mar 18 10:25:36 crc kubenswrapper[4733]: I0318 10:25:36.019743 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2nr27"]
Mar 18 10:25:36 crc kubenswrapper[4733]: W0318 10:25:36.031909 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod534b0ac6_c9b1_4940_9e6e_ed36de1ec1e8.slice/crio-498f44e558fbd34d8865d29469eb430458fbcd5e3192f9caad9e81c412b6775e WatchSource:0}: Error finding container 498f44e558fbd34d8865d29469eb430458fbcd5e3192f9caad9e81c412b6775e: Status 404 returned error can't find the container with id 498f44e558fbd34d8865d29469eb430458fbcd5e3192f9caad9e81c412b6775e
Mar 18 10:25:36 crc kubenswrapper[4733]: I0318 10:25:36.684098 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-rd2dh" event={"ID":"ce77d29d-b82e-46be-a694-b6eea5da9379","Type":"ContainerStarted","Data":"670eec05213c29a80328173b18779c9fe6a9cb2e660ea26278fbeec449be53fe"}
Mar 18 10:25:36 crc kubenswrapper[4733]: I0318 10:25:36.686929 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8ds68" event={"ID":"585c06be-86bd-48b7-954e-9aec01b08874","Type":"ContainerStarted","Data":"5c8591c08d39a2c1dcf39bc2b94cf19e43b4ae811739fc182daf63959295a448"}
Mar 18 10:25:36 crc kubenswrapper[4733]: I0318 10:25:36.689668 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27" event={"ID":"534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8","Type":"ContainerStarted","Data":"498f44e558fbd34d8865d29469eb430458fbcd5e3192f9caad9e81c412b6775e"}
Mar 18 10:25:40 crc kubenswrapper[4733]: I0318 10:25:40.720332 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8ds68" event={"ID":"585c06be-86bd-48b7-954e-9aec01b08874","Type":"ContainerStarted","Data":"08df0d2f67ec74d809ebd16ed32107787b740a0e4aeff6dcfb9418a3464f8be9"}
Mar 18 10:25:40 crc kubenswrapper[4733]: I0318 10:25:40.722422 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27" event={"ID":"534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8","Type":"ContainerStarted","Data":"adb53a592c227c2d4b28596b60bab28de1a00c8006702f74cfbf7e0127b715b9"}
Mar 18 10:25:40 crc kubenswrapper[4733]: I0318 10:25:40.722559 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27"
Mar 18 10:25:40 crc kubenswrapper[4733]: I0318 10:25:40.724451 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-rd2dh" event={"ID":"ce77d29d-b82e-46be-a694-b6eea5da9379","Type":"ContainerStarted","Data":"d9fed630241159e6db14d549fb1efe33161d7357f061ec3aa85a6de0b53aa9a0"}
Mar 18 10:25:40 crc kubenswrapper[4733]: I0318 10:25:40.743085 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8ds68" podStartSLOduration=1.8752565460000001 podStartE2EDuration="5.743071781s" podCreationTimestamp="2026-03-18 10:25:35 +0000 UTC" firstStartedPulling="2026-03-18 10:25:36.021523854 +0000 UTC m=+775.513258179" lastFinishedPulling="2026-03-18 10:25:39.889339079 +0000 UTC m=+779.381073414" observedRunningTime="2026-03-18 10:25:40.740704094 +0000 UTC m=+780.232438429" watchObservedRunningTime="2026-03-18 10:25:40.743071781 +0000 UTC m=+780.234806106"
Mar 18 10:25:40 crc kubenswrapper[4733]: I0318 10:25:40.763678 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27" podStartSLOduration=1.989877022 podStartE2EDuration="5.76366297s" podCreationTimestamp="2026-03-18 10:25:35 +0000 UTC" firstStartedPulling="2026-03-18 10:25:36.034011238 +0000 UTC m=+775.525745563" lastFinishedPulling="2026-03-18 10:25:39.807797146 +0000 UTC m=+779.299531511" observedRunningTime="2026-03-18 10:25:40.763413753 +0000 UTC m=+780.255148088" watchObservedRunningTime="2026-03-18 10:25:40.76366297 +0000 UTC m=+780.255397285"
Mar 18 10:25:40 crc kubenswrapper[4733]: I0318 10:25:40.794111 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-rd2dh" podStartSLOduration=1.756700613 podStartE2EDuration="5.794081959s" podCreationTimestamp="2026-03-18 10:25:35 +0000 UTC" firstStartedPulling="2026-03-18 10:25:35.769613747 +0000 UTC m=+775.261348072" lastFinishedPulling="2026-03-18 10:25:39.806995093 +0000 UTC m=+779.298729418" observedRunningTime="2026-03-18 10:25:40.787424088 +0000 UTC m=+780.279158413" watchObservedRunningTime="2026-03-18 10:25:40.794081959 +0000 UTC m=+780.285816324"
Mar 18 10:25:45 crc kubenswrapper[4733]: I0318 10:25:45.504586 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-2nr27"
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.351631 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pxwd"]
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.353261 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="nbdb" containerID="cri-o://10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77" gracePeriod=30
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.353403 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovn-acl-logging" containerID="cri-o://9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226" gracePeriod=30
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.353293 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1" gracePeriod=30
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.353497 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kube-rbac-proxy-node" containerID="cri-o://8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4" gracePeriod=30
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.353227 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovn-controller" containerID="cri-o://e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503" gracePeriod=30
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.353579 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="sbdb" containerID="cri-o://de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291" gracePeriod=30
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.353324 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="northd" containerID="cri-o://3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea" gracePeriod=30
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.412804 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" containerID="cri-o://850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395" gracePeriod=30
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.734396 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/3.log"
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.742683 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovn-acl-logging/0.log"
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.743688 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovn-controller/0.log"
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.744736 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd"
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783622 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-bin\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783694 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-node-log\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783751 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-config\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783798 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-netd\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783831 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-systemd-units\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783863 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-netns\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783907 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovn-node-metrics-cert\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783944 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-etc-openvswitch\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.783971 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-slash\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784000 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-openvswitch\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784028 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-systemd\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784075 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784108 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-ovn-kubernetes\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784160 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-ovn\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784229 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-log-socket\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784282 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-env-overrides\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784318 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-script-lib\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784350 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-kubelet\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784379 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-var-lib-openvswitch\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") "
Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.784411 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqxdr\" (UniqueName:
\"kubernetes.io/projected/73327417-4d3b-45f1-b3b6-575fdeeaa31a-kube-api-access-zqxdr\") pod \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\" (UID: \"73327417-4d3b-45f1-b3b6-575fdeeaa31a\") " Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785046 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785050 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-log-socket" (OuterVolumeSpecName: "log-socket") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785228 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785301 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785343 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785755 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785812 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785851 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785887 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-node-log" (OuterVolumeSpecName: "node-log") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785873 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785914 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.785970 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.786008 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.786047 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-slash" (OuterVolumeSpecName: "host-slash") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.786631 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.786896 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.787468 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.794310 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73327417-4d3b-45f1-b3b6-575fdeeaa31a-kube-api-access-zqxdr" (OuterVolumeSpecName: "kube-api-access-zqxdr") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "kube-api-access-zqxdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.796830 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.805268 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "73327417-4d3b-45f1-b3b6-575fdeeaa31a" (UID: "73327417-4d3b-45f1-b3b6-575fdeeaa31a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.815316 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/2.log" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.815965 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/1.log" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.816034 4733 generic.go:334] "Generic (PLEG): container finished" podID="cc85b0d4-15a5-4894-9f07-9aaeb28f63fa" containerID="e6e4d066d930397d09ab341b832e9b1659ca8d82f0e6fdc83f2d3f3738f5c64d" exitCode=2 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.816129 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6j2q" event={"ID":"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa","Type":"ContainerDied","Data":"e6e4d066d930397d09ab341b832e9b1659ca8d82f0e6fdc83f2d3f3738f5c64d"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.816182 4733 scope.go:117] "RemoveContainer" containerID="b6a4e9643a717b3f38fc1bed5c534e12bb873f0ffcf3c504cb4395c11621a73a" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.817370 4733 scope.go:117] "RemoveContainer" containerID="e6e4d066d930397d09ab341b832e9b1659ca8d82f0e6fdc83f2d3f3738f5c64d" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.818830 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-g6j2q_openshift-multus(cc85b0d4-15a5-4894-9f07-9aaeb28f63fa)\"" pod="openshift-multus/multus-g6j2q" podUID="cc85b0d4-15a5-4894-9f07-9aaeb28f63fa" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.823449 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovnkube-controller/3.log" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.827122 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovn-acl-logging/0.log" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.828015 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pxwd_73327417-4d3b-45f1-b3b6-575fdeeaa31a/ovn-controller/0.log" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.829565 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395" exitCode=0 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.829803 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291" exitCode=0 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.829978 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77" exitCode=0 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.830132 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea" exitCode=0 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.830328 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1" exitCode=0 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.830476 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4" exitCode=0 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.830667 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226" exitCode=143 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.830854 4733 generic.go:334] "Generic (PLEG): container finished" podID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerID="e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503" exitCode=143 Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.829717 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.831145 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.831315 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.831458 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.831587 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.831706 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.831855 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832000 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832133 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832289 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832412 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832523 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832620 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832749 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832872 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.832987 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.833103 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.833262 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.833406 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.833522 4733 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.833620 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.833728 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.833854 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.833969 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.834077 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.834220 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.834344 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.834463 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.834763 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.835033 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.835451 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.835670 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.835881 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.836093 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.836333 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} Mar 18 
10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.836532 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.836733 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.836997 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.837164 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" event={"ID":"73327417-4d3b-45f1-b3b6-575fdeeaa31a","Type":"ContainerDied","Data":"35bea9a3e63456f3c4522f7b18c54f2df3fc823d29bd3059264ea8e5f121d012"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.837333 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.837452 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.837592 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.837755 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.838107 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.838258 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.838457 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.838700 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.838874 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.838997 4733 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.839121 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwtdp"] Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.829695 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pxwd" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.839795 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.839937 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.840056 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.840341 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.841015 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.841160 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.841321 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="northd" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.841427 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="northd" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.841551 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.841676 4733 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.841858 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="nbdb" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.841980 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="nbdb" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.842095 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="sbdb" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.842236 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="sbdb" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.842376 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovn-acl-logging" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.842492 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovn-acl-logging" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.842836 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kube-rbac-proxy-node" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.842974 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kube-rbac-proxy-node" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.843096 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kubecfg-setup" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.843239 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kubecfg-setup" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.843374 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovn-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.843489 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovn-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.843810 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="northd" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.843965 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.844083 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.844254 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.844371 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="kube-rbac-proxy-node" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.844503 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.844611 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="sbdb" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.844729 4733 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovn-acl-logging" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.844856 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.845143 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovn-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.845339 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.845451 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="nbdb" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.845843 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.845996 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: E0318 10:25:51.847078 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.847308 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" containerName="ovnkube-controller" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.850117 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.851965 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.854633 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.855407 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.865756 4733 scope.go:117] "RemoveContainer" containerID="850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.887912 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-cni-bin\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.888032 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-env-overrides\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.888068 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-kubelet\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.888102 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24b8k\" (UniqueName: \"kubernetes.io/projected/a10a6717-677d-43b8-8d78-8f60b16836ed-kube-api-access-24b8k\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.888887 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-ovn\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.889000 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-ovnkube-config\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.889067 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-systemd\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.889098 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-ovnkube-script-lib\") pod \"ovnkube-node-pwtdp\" (UID: 
\"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.889137 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-cni-netd\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.889168 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-etc-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.891334 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-node-log\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.891673 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-var-lib-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.891745 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-run-ovn-kubernetes\") 
pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.891824 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-systemd-units\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.891878 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.891956 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892017 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a10a6717-677d-43b8-8d78-8f60b16836ed-ovn-node-metrics-cert\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892077 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-run-netns\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892141 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-slash\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892300 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-log-socket\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892645 4733 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892750 4733 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892824 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892900 4733 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.892993 4733 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893094 4733 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893172 4733 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893278 4733 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893359 4733 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893435 4733 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893515 4733 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 
10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893595 4733 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893736 4733 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893848 4733 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.893943 4733 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.894038 4733 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.894152 4733 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.894277 4733 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73327417-4d3b-45f1-b3b6-575fdeeaa31a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.894396 4733 
reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73327417-4d3b-45f1-b3b6-575fdeeaa31a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.894573 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqxdr\" (UniqueName: \"kubernetes.io/projected/73327417-4d3b-45f1-b3b6-575fdeeaa31a-kube-api-access-zqxdr\") on node \"crc\" DevicePath \"\"" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.897375 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.914927 4733 scope.go:117] "RemoveContainer" containerID="de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.932009 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pxwd"] Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.937692 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pxwd"] Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.944311 4733 scope.go:117] "RemoveContainer" containerID="10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.956879 4733 scope.go:117] "RemoveContainer" containerID="3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.967819 4733 scope.go:117] "RemoveContainer" containerID="c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.987437 4733 scope.go:117] "RemoveContainer" containerID="8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.995686 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.995764 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.995800 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a10a6717-677d-43b8-8d78-8f60b16836ed-ovn-node-metrics-cert\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.995840 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-run-netns\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.995852 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.995870 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-slash\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996254 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-log-socket\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996302 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-cni-bin\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996354 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-env-overrides\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996393 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24b8k\" (UniqueName: \"kubernetes.io/projected/a10a6717-677d-43b8-8d78-8f60b16836ed-kube-api-access-24b8k\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996422 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-kubelet\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996457 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-ovn\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996519 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-ovnkube-config\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996563 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-systemd\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996596 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-ovnkube-script-lib\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996634 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-cni-netd\") pod 
\"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996665 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-node-log\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996694 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-etc-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996734 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-var-lib-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996766 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996811 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-systemd-units\") pod \"ovnkube-node-pwtdp\" (UID: 
\"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996908 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-systemd-units\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.996970 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-cni-netd\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997017 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-node-log\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997035 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-env-overrides\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997061 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-etc-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: 
I0318 10:25:51.995910 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-slash\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997090 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-log-socket\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.995932 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997123 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-var-lib-openvswitch\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997165 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-ovn\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.995983 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-run-netns\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997254 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997130 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-cni-bin\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997417 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-host-kubelet\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997448 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a10a6717-677d-43b8-8d78-8f60b16836ed-run-systemd\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.997865 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-ovnkube-script-lib\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.998401 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a10a6717-677d-43b8-8d78-8f60b16836ed-ovnkube-config\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.998994 4733 scope.go:117] "RemoveContainer" containerID="9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226" Mar 18 10:25:51 crc kubenswrapper[4733]: I0318 10:25:51.999345 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a10a6717-677d-43b8-8d78-8f60b16836ed-ovn-node-metrics-cert\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.013507 4733 scope.go:117] "RemoveContainer" containerID="e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.016814 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24b8k\" (UniqueName: \"kubernetes.io/projected/a10a6717-677d-43b8-8d78-8f60b16836ed-kube-api-access-24b8k\") pod \"ovnkube-node-pwtdp\" (UID: \"a10a6717-677d-43b8-8d78-8f60b16836ed\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.032783 4733 scope.go:117] "RemoveContainer" containerID="d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.053403 4733 scope.go:117] 
"RemoveContainer" containerID="850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.054035 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": container with ID starting with 850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395 not found: ID does not exist" containerID="850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.054078 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} err="failed to get container status \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": rpc error: code = NotFound desc = could not find container \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": container with ID starting with 850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.054111 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.054713 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": container with ID starting with f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291 not found: ID does not exist" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.054750 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} err="failed to get container status \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": rpc error: code = NotFound desc = could not find container \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": container with ID starting with f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.054776 4733 scope.go:117] "RemoveContainer" containerID="de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.055427 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": container with ID starting with de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291 not found: ID does not exist" containerID="de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.055463 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} err="failed to get container status \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": rpc error: code = NotFound desc = could not find container \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": container with ID starting with de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.055490 4733 scope.go:117] "RemoveContainer" containerID="10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.057355 4733 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": container with ID starting with 10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77 not found: ID does not exist" containerID="10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.057391 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} err="failed to get container status \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": rpc error: code = NotFound desc = could not find container \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": container with ID starting with 10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.057422 4733 scope.go:117] "RemoveContainer" containerID="3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.058065 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": container with ID starting with 3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea not found: ID does not exist" containerID="3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.058101 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} err="failed to get container status \"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": rpc error: code = NotFound desc = could not find container 
\"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": container with ID starting with 3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.058125 4733 scope.go:117] "RemoveContainer" containerID="c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.058634 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": container with ID starting with c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1 not found: ID does not exist" containerID="c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.058665 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} err="failed to get container status \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": rpc error: code = NotFound desc = could not find container \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": container with ID starting with c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.058687 4733 scope.go:117] "RemoveContainer" containerID="8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.058985 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": container with ID starting with 8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4 not found: ID does not exist" 
containerID="8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.059008 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} err="failed to get container status \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": rpc error: code = NotFound desc = could not find container \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": container with ID starting with 8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.059025 4733 scope.go:117] "RemoveContainer" containerID="9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.059268 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": container with ID starting with 9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226 not found: ID does not exist" containerID="9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.059308 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} err="failed to get container status \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": rpc error: code = NotFound desc = could not find container \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": container with ID starting with 9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.059330 4733 scope.go:117] 
"RemoveContainer" containerID="e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.059573 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": container with ID starting with e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503 not found: ID does not exist" containerID="e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.059612 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} err="failed to get container status \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": rpc error: code = NotFound desc = could not find container \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": container with ID starting with e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.059642 4733 scope.go:117] "RemoveContainer" containerID="d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378" Mar 18 10:25:52 crc kubenswrapper[4733]: E0318 10:25:52.059917 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": container with ID starting with d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378 not found: ID does not exist" containerID="d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.059947 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} err="failed to get container status \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": rpc error: code = NotFound desc = could not find container \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": container with ID starting with d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.059970 4733 scope.go:117] "RemoveContainer" containerID="850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.060433 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} err="failed to get container status \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": rpc error: code = NotFound desc = could not find container \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": container with ID starting with 850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.060470 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.061002 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} err="failed to get container status \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": rpc error: code = NotFound desc = could not find container \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": container with ID starting with f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291 not found: ID does not 
exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.061053 4733 scope.go:117] "RemoveContainer" containerID="de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.061405 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} err="failed to get container status \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": rpc error: code = NotFound desc = could not find container \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": container with ID starting with de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.061440 4733 scope.go:117] "RemoveContainer" containerID="10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.061888 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} err="failed to get container status \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": rpc error: code = NotFound desc = could not find container \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": container with ID starting with 10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.061929 4733 scope.go:117] "RemoveContainer" containerID="3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.062763 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} err="failed to get container status 
\"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": rpc error: code = NotFound desc = could not find container \"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": container with ID starting with 3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.062799 4733 scope.go:117] "RemoveContainer" containerID="c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.063361 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} err="failed to get container status \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": rpc error: code = NotFound desc = could not find container \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": container with ID starting with c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.063399 4733 scope.go:117] "RemoveContainer" containerID="8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.063693 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} err="failed to get container status \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": rpc error: code = NotFound desc = could not find container \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": container with ID starting with 8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.063724 4733 scope.go:117] "RemoveContainer" 
containerID="9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.064007 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} err="failed to get container status \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": rpc error: code = NotFound desc = could not find container \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": container with ID starting with 9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.064033 4733 scope.go:117] "RemoveContainer" containerID="e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.064301 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} err="failed to get container status \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": rpc error: code = NotFound desc = could not find container \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": container with ID starting with e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.064326 4733 scope.go:117] "RemoveContainer" containerID="d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.064547 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} err="failed to get container status \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": rpc error: code = NotFound desc = could 
not find container \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": container with ID starting with d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.064570 4733 scope.go:117] "RemoveContainer" containerID="850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.064780 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} err="failed to get container status \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": rpc error: code = NotFound desc = could not find container \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": container with ID starting with 850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.064802 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065008 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} err="failed to get container status \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": rpc error: code = NotFound desc = could not find container \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": container with ID starting with f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065031 4733 scope.go:117] "RemoveContainer" containerID="de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 
10:25:52.065274 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} err="failed to get container status \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": rpc error: code = NotFound desc = could not find container \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": container with ID starting with de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065300 4733 scope.go:117] "RemoveContainer" containerID="10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065516 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} err="failed to get container status \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": rpc error: code = NotFound desc = could not find container \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": container with ID starting with 10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065539 4733 scope.go:117] "RemoveContainer" containerID="3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065740 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} err="failed to get container status \"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": rpc error: code = NotFound desc = could not find container \"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": container with ID starting with 
3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065763 4733 scope.go:117] "RemoveContainer" containerID="c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065971 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} err="failed to get container status \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": rpc error: code = NotFound desc = could not find container \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": container with ID starting with c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.065993 4733 scope.go:117] "RemoveContainer" containerID="8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.066253 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} err="failed to get container status \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": rpc error: code = NotFound desc = could not find container \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": container with ID starting with 8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4 not found: ID does not exist" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.066278 4733 scope.go:117] "RemoveContainer" containerID="9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226" Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.066489 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} err="failed to get container status \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": rpc error: code = NotFound desc = could not find container \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": container with ID starting with 9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.066512 4733 scope.go:117] "RemoveContainer" containerID="e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.066720 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} err="failed to get container status \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": rpc error: code = NotFound desc = could not find container \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": container with ID starting with e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.066743 4733 scope.go:117] "RemoveContainer" containerID="d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.066957 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} err="failed to get container status \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": rpc error: code = NotFound desc = could not find container \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": container with ID starting with d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.066987 4733 scope.go:117] "RemoveContainer" containerID="850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.067246 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395"} err="failed to get container status \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": rpc error: code = NotFound desc = could not find container \"850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395\": container with ID starting with 850880b1c00b2f5a5a32f08989e49cc1406960901b41de4ee69b92f38458d395 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.067272 4733 scope.go:117] "RemoveContainer" containerID="f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.067490 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291"} err="failed to get container status \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": rpc error: code = NotFound desc = could not find container \"f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291\": container with ID starting with f271860bb80800ec82f217effead5b1e9475829bbf78baea857aa7639eea7291 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.067516 4733 scope.go:117] "RemoveContainer" containerID="de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.067724 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291"} err="failed to get container status \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": rpc error: code = NotFound desc = could not find container \"de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291\": container with ID starting with de2d3be12ab406039374efc6a0094e21103be62b51ef65c4ccf5529d6ef05291 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.067746 4733 scope.go:117] "RemoveContainer" containerID="10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.067962 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77"} err="failed to get container status \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": rpc error: code = NotFound desc = could not find container \"10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77\": container with ID starting with 10bb15e1466307914440acf630eb4ea4a442cf4354a9cc5e6ddb40d8147a4d77 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.067992 4733 scope.go:117] "RemoveContainer" containerID="3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.068260 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea"} err="failed to get container status \"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": rpc error: code = NotFound desc = could not find container \"3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea\": container with ID starting with 3c7ac487e3e86e35e8a1d6ddf67975eb4a67657b219938a69a90ccd5774ee0ea not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.068294 4733 scope.go:117] "RemoveContainer" containerID="c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.068535 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1"} err="failed to get container status \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": rpc error: code = NotFound desc = could not find container \"c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1\": container with ID starting with c6233b06ff88fa77ac74becea9ef44fd4aa09b0ae718390c1b73c78d353ecbc1 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.068566 4733 scope.go:117] "RemoveContainer" containerID="8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.068803 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4"} err="failed to get container status \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": rpc error: code = NotFound desc = could not find container \"8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4\": container with ID starting with 8f1102a1e204850c8494267640da7ec93ec67e7341c4eb60d22a7f0772058cb4 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.068835 4733 scope.go:117] "RemoveContainer" containerID="9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.069065 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226"} err="failed to get container status \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": rpc error: code = NotFound desc = could not find container \"9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226\": container with ID starting with 9eb388a698a6a2ee1963deeba09459d7190f60ed189ee20a2ba24de317604226 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.069091 4733 scope.go:117] "RemoveContainer" containerID="e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.069356 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503"} err="failed to get container status \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": rpc error: code = NotFound desc = could not find container \"e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503\": container with ID starting with e3043711b80807f025d3bd2e7b4593f22d78dd3e458aa185c18c065af4aca503 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.069389 4733 scope.go:117] "RemoveContainer" containerID="d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.069635 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378"} err="failed to get container status \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": rpc error: code = NotFound desc = could not find container \"d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378\": container with ID starting with d8e8b34a9866e756d4bedfedebd589abe90e519eb174e5962e4744d0d6c7f378 not found: ID does not exist"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.180478 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp"
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.842895 4733 generic.go:334] "Generic (PLEG): container finished" podID="a10a6717-677d-43b8-8d78-8f60b16836ed" containerID="d4da2b3b60b9de1cafe978e281f7d1a58bcc91654d6d64b25f5b83fe0d977358" exitCode=0
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.842965 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerDied","Data":"d4da2b3b60b9de1cafe978e281f7d1a58bcc91654d6d64b25f5b83fe0d977358"}
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.842988 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"61932908893bc650e446c63cff073523850debe73375b309dc80abfe37eae512"}
Mar 18 10:25:52 crc kubenswrapper[4733]: I0318 10:25:52.848079 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/2.log"
Mar 18 10:25:53 crc kubenswrapper[4733]: I0318 10:25:53.185285 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73327417-4d3b-45f1-b3b6-575fdeeaa31a" path="/var/lib/kubelet/pods/73327417-4d3b-45f1-b3b6-575fdeeaa31a/volumes"
Mar 18 10:25:53 crc kubenswrapper[4733]: I0318 10:25:53.862335 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"a2d893b5c4cf7c22fc1504a2a49212da6cfea5758ab2bc466cac6aa3003a9f92"}
Mar 18 10:25:53 crc kubenswrapper[4733]: I0318 10:25:53.862908 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"9ac071de97432e33422df8a380f9305452b3361e02459419299a65bb49a7a681"}
Mar 18 10:25:53 crc kubenswrapper[4733]: I0318 10:25:53.862942 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"e45684d426968124d1ff7372ac51e178cd6e9a5fdf87a6478b6776c86c417590"}
Mar 18 10:25:53 crc kubenswrapper[4733]: I0318 10:25:53.862970 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"c4f5d37616b6ca27fb9b1ff3b37f035d03db2a230aa08159945f7c70446a286b"}
Mar 18 10:25:53 crc kubenswrapper[4733]: I0318 10:25:53.862995 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"baaee666b1ce55cbef60b96672d5f702aa3e96811a7b8870393b299aa946d496"}
Mar 18 10:25:53 crc kubenswrapper[4733]: I0318 10:25:53.863019 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"cea74c0bbf0a34edcc4406c026e508f89cb98477944ee37f0a516e38b1ac1083"}
Mar 18 10:25:56 crc kubenswrapper[4733]: I0318 10:25:56.902113 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"7a324d5a2a6e0aa931bdb486b9d81da9aee89d6483988b88c47c792938b3c49c"}
Mar 18 10:25:58 crc kubenswrapper[4733]: I0318 10:25:58.928436 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" event={"ID":"a10a6717-677d-43b8-8d78-8f60b16836ed","Type":"ContainerStarted","Data":"2474aca6cf95defd9534c2913ac781f9cbd560505ecc63aec421423f42d9bb6b"}
Mar 18 10:25:58 crc kubenswrapper[4733]: I0318 10:25:58.930394 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp"
Mar 18 10:25:58 crc kubenswrapper[4733]: I0318 10:25:58.930421 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp"
Mar 18 10:25:58 crc kubenswrapper[4733]: I0318 10:25:58.930431 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp"
Mar 18 10:25:58 crc kubenswrapper[4733]: I0318 10:25:58.959080 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp" podStartSLOduration=7.959060382 podStartE2EDuration="7.959060382s" podCreationTimestamp="2026-03-18 10:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:25:58.958567808 +0000 UTC m=+798.450302143" watchObservedRunningTime="2026-03-18 10:25:58.959060382 +0000 UTC m=+798.450794707"
Mar 18 10:25:58 crc kubenswrapper[4733]: I0318 10:25:58.964205 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp"
Mar 18 10:25:58 crc kubenswrapper[4733]: I0318 10:25:58.966287 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.150624 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563826-tfzqx"]
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.151741 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.156241 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.156660 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.156783 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.167635 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-tfzqx"]
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.237022 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp6vj\" (UniqueName: \"kubernetes.io/projected/eb9f28a6-6f4c-440b-abfc-cca26041cbef-kube-api-access-bp6vj\") pod \"auto-csr-approver-29563826-tfzqx\" (UID: \"eb9f28a6-6f4c-440b-abfc-cca26041cbef\") " pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.338732 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp6vj\" (UniqueName: \"kubernetes.io/projected/eb9f28a6-6f4c-440b-abfc-cca26041cbef-kube-api-access-bp6vj\") pod \"auto-csr-approver-29563826-tfzqx\" (UID: \"eb9f28a6-6f4c-440b-abfc-cca26041cbef\") " pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.365909 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp6vj\" (UniqueName: \"kubernetes.io/projected/eb9f28a6-6f4c-440b-abfc-cca26041cbef-kube-api-access-bp6vj\") pod \"auto-csr-approver-29563826-tfzqx\" (UID: \"eb9f28a6-6f4c-440b-abfc-cca26041cbef\") " pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.477798 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: E0318 10:26:00.520820 4733 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(cb18a9006cb61d0feef0c6ba70e5ea60694d961bc6d25656f016d8e25a763ae7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 10:26:00 crc kubenswrapper[4733]: E0318 10:26:00.521382 4733 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(cb18a9006cb61d0feef0c6ba70e5ea60694d961bc6d25656f016d8e25a763ae7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: E0318 10:26:00.521426 4733 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(cb18a9006cb61d0feef0c6ba70e5ea60694d961bc6d25656f016d8e25a763ae7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: E0318 10:26:00.521554 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563826-tfzqx_openshift-infra(eb9f28a6-6f4c-440b-abfc-cca26041cbef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563826-tfzqx_openshift-infra(eb9f28a6-6f4c-440b-abfc-cca26041cbef)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(cb18a9006cb61d0feef0c6ba70e5ea60694d961bc6d25656f016d8e25a763ae7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563826-tfzqx" podUID="eb9f28a6-6f4c-440b-abfc-cca26041cbef"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.941128 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: I0318 10:26:00.941602 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: E0318 10:26:00.975239 4733 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(aa2611edfeef07d4211957559e0fe8996517afcd8ee9dd4dca6e796dc9e459fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 10:26:00 crc kubenswrapper[4733]: E0318 10:26:00.975309 4733 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(aa2611edfeef07d4211957559e0fe8996517afcd8ee9dd4dca6e796dc9e459fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: E0318 10:26:00.975335 4733 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(aa2611edfeef07d4211957559e0fe8996517afcd8ee9dd4dca6e796dc9e459fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:00 crc kubenswrapper[4733]: E0318 10:26:00.975402 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563826-tfzqx_openshift-infra(eb9f28a6-6f4c-440b-abfc-cca26041cbef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563826-tfzqx_openshift-infra(eb9f28a6-6f4c-440b-abfc-cca26041cbef)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(aa2611edfeef07d4211957559e0fe8996517afcd8ee9dd4dca6e796dc9e459fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563826-tfzqx" podUID="eb9f28a6-6f4c-440b-abfc-cca26041cbef"
Mar 18 10:26:05 crc kubenswrapper[4733]: I0318 10:26:05.175418 4733 scope.go:117] "RemoveContainer" containerID="e6e4d066d930397d09ab341b832e9b1659ca8d82f0e6fdc83f2d3f3738f5c64d"
Mar 18 10:26:05 crc kubenswrapper[4733]: E0318 10:26:05.176712 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-g6j2q_openshift-multus(cc85b0d4-15a5-4894-9f07-9aaeb28f63fa)\"" pod="openshift-multus/multus-g6j2q" podUID="cc85b0d4-15a5-4894-9f07-9aaeb28f63fa"
Mar 18 10:26:13 crc kubenswrapper[4733]: I0318 10:26:13.571727 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:26:13 crc kubenswrapper[4733]: I0318 10:26:13.572820 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:26:14 crc kubenswrapper[4733]: I0318 10:26:14.174901 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:14 crc kubenswrapper[4733]: I0318 10:26:14.175723 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:14 crc kubenswrapper[4733]: E0318 10:26:14.216279 4733 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(fe870290ef5184b8c09b344f1f32f093def53a2f6d05b152aa0dbc0f6af9f396): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 10:26:14 crc kubenswrapper[4733]: E0318 10:26:14.216389 4733 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(fe870290ef5184b8c09b344f1f32f093def53a2f6d05b152aa0dbc0f6af9f396): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:14 crc kubenswrapper[4733]: E0318 10:26:14.216435 4733 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(fe870290ef5184b8c09b344f1f32f093def53a2f6d05b152aa0dbc0f6af9f396): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:14 crc kubenswrapper[4733]: E0318 10:26:14.216527 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29563826-tfzqx_openshift-infra(eb9f28a6-6f4c-440b-abfc-cca26041cbef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29563826-tfzqx_openshift-infra(eb9f28a6-6f4c-440b-abfc-cca26041cbef)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29563826-tfzqx_openshift-infra_eb9f28a6-6f4c-440b-abfc-cca26041cbef_0(fe870290ef5184b8c09b344f1f32f093def53a2f6d05b152aa0dbc0f6af9f396): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29563826-tfzqx" podUID="eb9f28a6-6f4c-440b-abfc-cca26041cbef"
Mar 18 10:26:18 crc kubenswrapper[4733]: I0318 10:26:18.176257 4733 scope.go:117] "RemoveContainer" containerID="e6e4d066d930397d09ab341b832e9b1659ca8d82f0e6fdc83f2d3f3738f5c64d"
Mar 18 10:26:19 crc kubenswrapper[4733]: I0318 10:26:19.078902 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g6j2q_cc85b0d4-15a5-4894-9f07-9aaeb28f63fa/kube-multus/2.log"
Mar 18 10:26:19 crc kubenswrapper[4733]: I0318 10:26:19.079350 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6j2q" event={"ID":"cc85b0d4-15a5-4894-9f07-9aaeb28f63fa","Type":"ContainerStarted","Data":"2634f8ab38a5754851b399ebb93dc944c97649d7f30f22a1e2664690641f0fa7"}
Mar 18 10:26:22 crc kubenswrapper[4733]: I0318 10:26:22.216925 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwtdp"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.039034 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"]
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.040374 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.042966 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.050584 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"]
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.117906 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.117989 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlzb5\" (UniqueName: \"kubernetes.io/projected/3f95562e-ae03-4b2d-92b7-bc5593785f3c-kube-api-access-qlzb5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.118041 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.219320 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.219389 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlzb5\" (UniqueName: \"kubernetes.io/projected/3f95562e-ae03-4b2d-92b7-bc5593785f3c-kube-api-access-qlzb5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.219434 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.221550 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.221961 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.256805 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlzb5\" (UniqueName: \"kubernetes.io/projected/3f95562e-ae03-4b2d-92b7-bc5593785f3c-kube-api-access-qlzb5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.360658 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:23 crc kubenswrapper[4733]: I0318 10:26:23.681337 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"]
Mar 18 10:26:24 crc kubenswrapper[4733]: I0318 10:26:24.125400 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p" event={"ID":"3f95562e-ae03-4b2d-92b7-bc5593785f3c","Type":"ContainerStarted","Data":"6c79fe35f870da8b106890820b1a9cf7106b40a2dceff5de739a0c9396b65f31"}
Mar 18 10:26:24 crc kubenswrapper[4733]: I0318 10:26:24.125896 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p" event={"ID":"3f95562e-ae03-4b2d-92b7-bc5593785f3c","Type":"ContainerStarted","Data":"115707dd95d204d5c85fc33da24d6f18d091d0e56db5ebe637e2d7aaa0ea6ad1"}
Mar 18 10:26:26 crc kubenswrapper[4733]: I0318 10:26:26.141093 4733 generic.go:334] "Generic (PLEG): container finished" podID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerID="6c79fe35f870da8b106890820b1a9cf7106b40a2dceff5de739a0c9396b65f31" exitCode=0
Mar 18 10:26:26 crc kubenswrapper[4733]: I0318 10:26:26.141257 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p" event={"ID":"3f95562e-ae03-4b2d-92b7-bc5593785f3c","Type":"ContainerDied","Data":"6c79fe35f870da8b106890820b1a9cf7106b40a2dceff5de739a0c9396b65f31"}
Mar 18 10:26:28 crc kubenswrapper[4733]: I0318 10:26:28.159178 4733 generic.go:334] "Generic (PLEG): container finished" podID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerID="f7502e815678bd10e8b02876d4de520d25734e6bcf12503e9d9d6e0c0c015342" exitCode=0
Mar 18 10:26:28 crc kubenswrapper[4733]: I0318 10:26:28.159714 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p" event={"ID":"3f95562e-ae03-4b2d-92b7-bc5593785f3c","Type":"ContainerDied","Data":"f7502e815678bd10e8b02876d4de520d25734e6bcf12503e9d9d6e0c0c015342"}
Mar 18 10:26:28 crc kubenswrapper[4733]: I0318 10:26:28.175465 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:28 crc kubenswrapper[4733]: I0318 10:26:28.176420 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx"
Mar 18 10:26:28 crc kubenswrapper[4733]: I0318 10:26:28.476915 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-tfzqx"]
Mar 18 10:26:28 crc kubenswrapper[4733]: W0318 10:26:28.490162 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb9f28a6_6f4c_440b_abfc_cca26041cbef.slice/crio-9f8f68c9457c58f5aad92937557aa65f4fad0e1186bd7002a29cc99219d0aec6 WatchSource:0}: Error finding container 9f8f68c9457c58f5aad92937557aa65f4fad0e1186bd7002a29cc99219d0aec6: Status 404 returned error can't find the container with id 9f8f68c9457c58f5aad92937557aa65f4fad0e1186bd7002a29cc99219d0aec6
Mar 18 10:26:29 crc kubenswrapper[4733]: I0318 10:26:29.172467 4733 generic.go:334] "Generic (PLEG): container finished" podID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerID="aaa730e8d1687e25ca04b79b8603e52bb0f970b31d74876a63a754170920f8b9" exitCode=0
Mar 18 10:26:29 crc kubenswrapper[4733]: I0318 10:26:29.173092 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p" event={"ID":"3f95562e-ae03-4b2d-92b7-bc5593785f3c","Type":"ContainerDied","Data":"aaa730e8d1687e25ca04b79b8603e52bb0f970b31d74876a63a754170920f8b9"}
Mar 18 10:26:29 crc kubenswrapper[4733]: I0318 10:26:29.175077 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-tfzqx" event={"ID":"eb9f28a6-6f4c-440b-abfc-cca26041cbef","Type":"ContainerStarted","Data":"9f8f68c9457c58f5aad92937557aa65f4fad0e1186bd7002a29cc99219d0aec6"}
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.184355 4733 generic.go:334] "Generic (PLEG): container finished" podID="eb9f28a6-6f4c-440b-abfc-cca26041cbef" containerID="73a17ce4bce512adc8ff4282e561fca0880aa24a1a28aaa52332d077a8673f8c" exitCode=0
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.184430 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-tfzqx" event={"ID":"eb9f28a6-6f4c-440b-abfc-cca26041cbef","Type":"ContainerDied","Data":"73a17ce4bce512adc8ff4282e561fca0880aa24a1a28aaa52332d077a8673f8c"}
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.495248 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.635486 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-util\") pod \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") "
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.635581 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlzb5\" (UniqueName: \"kubernetes.io/projected/3f95562e-ae03-4b2d-92b7-bc5593785f3c-kube-api-access-qlzb5\") pod \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") "
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.635651 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-bundle\") pod \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\" (UID: \"3f95562e-ae03-4b2d-92b7-bc5593785f3c\") "
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.636713 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-bundle" (OuterVolumeSpecName: "bundle") pod "3f95562e-ae03-4b2d-92b7-bc5593785f3c" (UID: "3f95562e-ae03-4b2d-92b7-bc5593785f3c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.643521 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f95562e-ae03-4b2d-92b7-bc5593785f3c-kube-api-access-qlzb5" (OuterVolumeSpecName: "kube-api-access-qlzb5") pod "3f95562e-ae03-4b2d-92b7-bc5593785f3c" (UID: "3f95562e-ae03-4b2d-92b7-bc5593785f3c"). InnerVolumeSpecName "kube-api-access-qlzb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.654828 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-util" (OuterVolumeSpecName: "util") pod "3f95562e-ae03-4b2d-92b7-bc5593785f3c" (UID: "3f95562e-ae03-4b2d-92b7-bc5593785f3c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.737301 4733 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-util\") on node \"crc\" DevicePath \"\""
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.737356 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlzb5\" (UniqueName: \"kubernetes.io/projected/3f95562e-ae03-4b2d-92b7-bc5593785f3c-kube-api-access-qlzb5\") on node \"crc\" DevicePath \"\""
Mar 18 10:26:30 crc kubenswrapper[4733]: I0318 10:26:30.737378 4733 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f95562e-ae03-4b2d-92b7-bc5593785f3c-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 10:26:31 crc kubenswrapper[4733]: I0318 10:26:31.199021 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p"
event={"ID":"3f95562e-ae03-4b2d-92b7-bc5593785f3c","Type":"ContainerDied","Data":"115707dd95d204d5c85fc33da24d6f18d091d0e56db5ebe637e2d7aaa0ea6ad1"} Mar 18 10:26:31 crc kubenswrapper[4733]: I0318 10:26:31.199552 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="115707dd95d204d5c85fc33da24d6f18d091d0e56db5ebe637e2d7aaa0ea6ad1" Mar 18 10:26:31 crc kubenswrapper[4733]: I0318 10:26:31.199038 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p" Mar 18 10:26:31 crc kubenswrapper[4733]: I0318 10:26:31.517236 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx" Mar 18 10:26:31 crc kubenswrapper[4733]: I0318 10:26:31.549581 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp6vj\" (UniqueName: \"kubernetes.io/projected/eb9f28a6-6f4c-440b-abfc-cca26041cbef-kube-api-access-bp6vj\") pod \"eb9f28a6-6f4c-440b-abfc-cca26041cbef\" (UID: \"eb9f28a6-6f4c-440b-abfc-cca26041cbef\") " Mar 18 10:26:31 crc kubenswrapper[4733]: I0318 10:26:31.556198 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9f28a6-6f4c-440b-abfc-cca26041cbef-kube-api-access-bp6vj" (OuterVolumeSpecName: "kube-api-access-bp6vj") pod "eb9f28a6-6f4c-440b-abfc-cca26041cbef" (UID: "eb9f28a6-6f4c-440b-abfc-cca26041cbef"). InnerVolumeSpecName "kube-api-access-bp6vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:26:31 crc kubenswrapper[4733]: I0318 10:26:31.650960 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp6vj\" (UniqueName: \"kubernetes.io/projected/eb9f28a6-6f4c-440b-abfc-cca26041cbef-kube-api-access-bp6vj\") on node \"crc\" DevicePath \"\"" Mar 18 10:26:32 crc kubenswrapper[4733]: I0318 10:26:32.209548 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563826-tfzqx" event={"ID":"eb9f28a6-6f4c-440b-abfc-cca26041cbef","Type":"ContainerDied","Data":"9f8f68c9457c58f5aad92937557aa65f4fad0e1186bd7002a29cc99219d0aec6"} Mar 18 10:26:32 crc kubenswrapper[4733]: I0318 10:26:32.209992 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f8f68c9457c58f5aad92937557aa65f4fad0e1186bd7002a29cc99219d0aec6" Mar 18 10:26:32 crc kubenswrapper[4733]: I0318 10:26:32.210100 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563826-tfzqx" Mar 18 10:26:32 crc kubenswrapper[4733]: I0318 10:26:32.612637 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-x8mq8"] Mar 18 10:26:32 crc kubenswrapper[4733]: I0318 10:26:32.619266 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563820-x8mq8"] Mar 18 10:26:33 crc kubenswrapper[4733]: I0318 10:26:33.190302 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949d71ae-f754-4b5c-8c0b-fec8d374f27e" path="/var/lib/kubelet/pods/949d71ae-f754-4b5c-8c0b-fec8d374f27e/volumes" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.449003 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp"] Mar 18 10:26:34 crc kubenswrapper[4733]: E0318 10:26:34.449237 4733 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerName="pull" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.449248 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerName="pull" Mar 18 10:26:34 crc kubenswrapper[4733]: E0318 10:26:34.449255 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9f28a6-6f4c-440b-abfc-cca26041cbef" containerName="oc" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.449261 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9f28a6-6f4c-440b-abfc-cca26041cbef" containerName="oc" Mar 18 10:26:34 crc kubenswrapper[4733]: E0318 10:26:34.449275 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerName="extract" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.449282 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerName="extract" Mar 18 10:26:34 crc kubenswrapper[4733]: E0318 10:26:34.449291 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerName="util" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.449297 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerName="util" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.449392 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9f28a6-6f4c-440b-abfc-cca26041cbef" containerName="oc" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.449402 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f95562e-ae03-4b2d-92b7-bc5593785f3c" containerName="extract" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.449779 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.452258 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.452414 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.452808 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-kdvvs" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.461299 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp"] Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.519315 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbp2v\" (UniqueName: \"kubernetes.io/projected/7c8f098b-42c0-4132-88c0-350e0c872f9d-kube-api-access-wbp2v\") pod \"nmstate-operator-796d4cfff4-2d4dp\" (UID: \"7c8f098b-42c0-4132-88c0-350e0c872f9d\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.620325 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbp2v\" (UniqueName: \"kubernetes.io/projected/7c8f098b-42c0-4132-88c0-350e0c872f9d-kube-api-access-wbp2v\") pod \"nmstate-operator-796d4cfff4-2d4dp\" (UID: \"7c8f098b-42c0-4132-88c0-350e0c872f9d\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.638786 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbp2v\" (UniqueName: \"kubernetes.io/projected/7c8f098b-42c0-4132-88c0-350e0c872f9d-kube-api-access-wbp2v\") pod \"nmstate-operator-796d4cfff4-2d4dp\" (UID: 
\"7c8f098b-42c0-4132-88c0-350e0c872f9d\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp" Mar 18 10:26:34 crc kubenswrapper[4733]: I0318 10:26:34.824404 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp" Mar 18 10:26:35 crc kubenswrapper[4733]: I0318 10:26:35.068254 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp"] Mar 18 10:26:35 crc kubenswrapper[4733]: I0318 10:26:35.250466 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp" event={"ID":"7c8f098b-42c0-4132-88c0-350e0c872f9d","Type":"ContainerStarted","Data":"55b8034339e1f47af10841a0fb6d53aed4dbdcf3f614f4d6e30878c8f218ef41"} Mar 18 10:26:38 crc kubenswrapper[4733]: I0318 10:26:38.277857 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp" event={"ID":"7c8f098b-42c0-4132-88c0-350e0c872f9d","Type":"ContainerStarted","Data":"d7d6ef59a7ccc94cd571a4f6c8fa734e4d332d1da6eabe513015c2090c31f499"} Mar 18 10:26:38 crc kubenswrapper[4733]: I0318 10:26:38.300416 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2d4dp" podStartSLOduration=1.9651879270000001 podStartE2EDuration="4.300387377s" podCreationTimestamp="2026-03-18 10:26:34 +0000 UTC" firstStartedPulling="2026-03-18 10:26:35.08124262 +0000 UTC m=+834.572976955" lastFinishedPulling="2026-03-18 10:26:37.41644208 +0000 UTC m=+836.908176405" observedRunningTime="2026-03-18 10:26:38.294681294 +0000 UTC m=+837.786415689" watchObservedRunningTime="2026-03-18 10:26:38.300387377 +0000 UTC m=+837.792121742" Mar 18 10:26:40 crc kubenswrapper[4733]: I0318 10:26:40.038561 4733 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 10:26:43 crc 
kubenswrapper[4733]: I0318 10:26:43.000949 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.003261 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.005337 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-lcxp9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.043737 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8jncr"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.046914 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.061592 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.062550 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.064473 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.065803 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.080433 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.147396 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.148848 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-nmstate-lock\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.148927 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg4n7\" (UniqueName: \"kubernetes.io/projected/4c5d76ae-c917-4ba7-91d7-332a8e578245-kube-api-access-lg4n7\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.148955 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-dbus-socket\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" 
Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.148974 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-ovs-socket\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.149001 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nj9z\" (UniqueName: \"kubernetes.io/projected/eb2e5225-c943-4b06-b2de-90ab1168242b-kube-api-access-5nj9z\") pod \"nmstate-metrics-9b8c8685d-7swn6\" (UID: \"eb2e5225-c943-4b06-b2de-90ab1168242b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.151364 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.153404 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-l7t6j" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.155635 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.155879 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.161819 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250010 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nj9z\" (UniqueName: \"kubernetes.io/projected/eb2e5225-c943-4b06-b2de-90ab1168242b-kube-api-access-5nj9z\") 
pod \"nmstate-metrics-9b8c8685d-7swn6\" (UID: \"eb2e5225-c943-4b06-b2de-90ab1168242b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250418 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-nmstate-lock\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250469 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/29460af7-7801-4268-aae8-f84763762e2f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m6rhx\" (UID: \"29460af7-7801-4268-aae8-f84763762e2f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250500 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5fkq\" (UniqueName: \"kubernetes.io/projected/29460af7-7801-4268-aae8-f84763762e2f-kube-api-access-p5fkq\") pod \"nmstate-webhook-5f558f5558-m6rhx\" (UID: \"29460af7-7801-4268-aae8-f84763762e2f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250550 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg4n7\" (UniqueName: \"kubernetes.io/projected/4c5d76ae-c917-4ba7-91d7-332a8e578245-kube-api-access-lg4n7\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250626 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-nmstate-lock\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250816 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-dbus-socket\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250852 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-ovs-socket\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.250927 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-ovs-socket\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.251136 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4c5d76ae-c917-4ba7-91d7-332a8e578245-dbus-socket\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.273625 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nj9z\" (UniqueName: \"kubernetes.io/projected/eb2e5225-c943-4b06-b2de-90ab1168242b-kube-api-access-5nj9z\") pod \"nmstate-metrics-9b8c8685d-7swn6\" 
(UID: \"eb2e5225-c943-4b06-b2de-90ab1168242b\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.273753 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg4n7\" (UniqueName: \"kubernetes.io/projected/4c5d76ae-c917-4ba7-91d7-332a8e578245-kube-api-access-lg4n7\") pod \"nmstate-handler-8jncr\" (UID: \"4c5d76ae-c917-4ba7-91d7-332a8e578245\") " pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.343985 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dd4b96b5d-zqmlh"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.344735 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.347667 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.351744 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-oauth-serving-cert\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.351806 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/945cd091-c09f-4763-8093-ba83e642949f-console-oauth-config\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.351849 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qlmx\" (UniqueName: \"kubernetes.io/projected/95b678ac-c7be-4c57-8663-05b207f43338-kube-api-access-5qlmx\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" (UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.351886 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/29460af7-7801-4268-aae8-f84763762e2f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m6rhx\" (UID: \"29460af7-7801-4268-aae8-f84763762e2f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.351910 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5fkq\" (UniqueName: \"kubernetes.io/projected/29460af7-7801-4268-aae8-f84763762e2f-kube-api-access-p5fkq\") pod \"nmstate-webhook-5f558f5558-m6rhx\" (UID: \"29460af7-7801-4268-aae8-f84763762e2f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.351930 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-service-ca\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.351971 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/945cd091-c09f-4763-8093-ba83e642949f-console-serving-cert\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 
10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.352003 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-console-config\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.352023 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm52g\" (UniqueName: \"kubernetes.io/projected/945cd091-c09f-4763-8093-ba83e642949f-kube-api-access-sm52g\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.352043 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b678ac-c7be-4c57-8663-05b207f43338-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" (UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.352062 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-trusted-ca-bundle\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.352087 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/95b678ac-c7be-4c57-8663-05b207f43338-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" 
(UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.357947 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/29460af7-7801-4268-aae8-f84763762e2f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m6rhx\" (UID: \"29460af7-7801-4268-aae8-f84763762e2f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.358684 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dd4b96b5d-zqmlh"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.370650 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5fkq\" (UniqueName: \"kubernetes.io/projected/29460af7-7801-4268-aae8-f84763762e2f-kube-api-access-p5fkq\") pod \"nmstate-webhook-5f558f5558-m6rhx\" (UID: \"29460af7-7801-4268-aae8-f84763762e2f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.373225 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.378474 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453373 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-oauth-serving-cert\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453724 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/945cd091-c09f-4763-8093-ba83e642949f-console-oauth-config\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453764 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qlmx\" (UniqueName: \"kubernetes.io/projected/95b678ac-c7be-4c57-8663-05b207f43338-kube-api-access-5qlmx\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" (UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453795 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-service-ca\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453838 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/945cd091-c09f-4763-8093-ba83e642949f-console-serving-cert\") pod \"console-5dd4b96b5d-zqmlh\" (UID: 
\"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453859 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-console-config\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453873 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm52g\" (UniqueName: \"kubernetes.io/projected/945cd091-c09f-4763-8093-ba83e642949f-kube-api-access-sm52g\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453890 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b678ac-c7be-4c57-8663-05b207f43338-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" (UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453924 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-trusted-ca-bundle\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.453945 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/95b678ac-c7be-4c57-8663-05b207f43338-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" 
(UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.454541 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-oauth-serving-cert\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.454869 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/95b678ac-c7be-4c57-8663-05b207f43338-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" (UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.456072 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-console-config\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.457318 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-service-ca\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.457325 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/945cd091-c09f-4763-8093-ba83e642949f-trusted-ca-bundle\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " 
pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.462368 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b678ac-c7be-4c57-8663-05b207f43338-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" (UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.470475 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/945cd091-c09f-4763-8093-ba83e642949f-console-serving-cert\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.475463 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qlmx\" (UniqueName: \"kubernetes.io/projected/95b678ac-c7be-4c57-8663-05b207f43338-kube-api-access-5qlmx\") pod \"nmstate-console-plugin-86f58fcf4-btpf9\" (UID: \"95b678ac-c7be-4c57-8663-05b207f43338\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.475720 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/945cd091-c09f-4763-8093-ba83e642949f-console-oauth-config\") pod \"console-5dd4b96b5d-zqmlh\" (UID: \"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.477060 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm52g\" (UniqueName: \"kubernetes.io/projected/945cd091-c09f-4763-8093-ba83e642949f-kube-api-access-sm52g\") pod \"console-5dd4b96b5d-zqmlh\" (UID: 
\"945cd091-c09f-4763-8093-ba83e642949f\") " pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.570839 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.570893 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.707223 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.771940 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.802939 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.845432 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx"] Mar 18 10:26:43 crc kubenswrapper[4733]: I0318 10:26:43.949104 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dd4b96b5d-zqmlh"] Mar 18 10:26:43 crc kubenswrapper[4733]: W0318 10:26:43.955388 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod945cd091_c09f_4763_8093_ba83e642949f.slice/crio-0fcbc5b2715fb945771bb4e45ca8e97ef927621c516ac4545d113d30b21dbe45 WatchSource:0}: Error finding container 0fcbc5b2715fb945771bb4e45ca8e97ef927621c516ac4545d113d30b21dbe45: Status 404 returned error can't find the container with id 0fcbc5b2715fb945771bb4e45ca8e97ef927621c516ac4545d113d30b21dbe45 Mar 18 10:26:44 crc kubenswrapper[4733]: I0318 10:26:44.220729 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9"] Mar 18 10:26:44 crc kubenswrapper[4733]: W0318 10:26:44.222769 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b678ac_c7be_4c57_8663_05b207f43338.slice/crio-1dc84b8c6e3d970faa0bdbd3e8648b36dd5718802988d88510f633589e313fde WatchSource:0}: Error finding container 1dc84b8c6e3d970faa0bdbd3e8648b36dd5718802988d88510f633589e313fde: Status 404 returned error can't find the container with id 1dc84b8c6e3d970faa0bdbd3e8648b36dd5718802988d88510f633589e313fde Mar 18 10:26:44 crc kubenswrapper[4733]: I0318 10:26:44.320691 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-8jncr" event={"ID":"4c5d76ae-c917-4ba7-91d7-332a8e578245","Type":"ContainerStarted","Data":"6ab5acedaccba19ca2d61c2a4cb76a0aded93dc197ab1404f37722f281697a54"} Mar 18 10:26:44 crc kubenswrapper[4733]: I0318 10:26:44.322124 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" event={"ID":"eb2e5225-c943-4b06-b2de-90ab1168242b","Type":"ContainerStarted","Data":"5080197f0636affd33fb711f1b386ccebdd88e500e6f1a865cdbbdf72b2bddb2"} Mar 18 10:26:44 crc kubenswrapper[4733]: I0318 10:26:44.323472 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" event={"ID":"95b678ac-c7be-4c57-8663-05b207f43338","Type":"ContainerStarted","Data":"1dc84b8c6e3d970faa0bdbd3e8648b36dd5718802988d88510f633589e313fde"} Mar 18 10:26:44 crc kubenswrapper[4733]: I0318 10:26:44.327425 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd4b96b5d-zqmlh" event={"ID":"945cd091-c09f-4763-8093-ba83e642949f","Type":"ContainerStarted","Data":"cc47ca98e8782deb4d798932d2dc218a2467fdeb0abee8feb00454d70f952efa"} Mar 18 10:26:44 crc kubenswrapper[4733]: I0318 10:26:44.327466 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd4b96b5d-zqmlh" event={"ID":"945cd091-c09f-4763-8093-ba83e642949f","Type":"ContainerStarted","Data":"0fcbc5b2715fb945771bb4e45ca8e97ef927621c516ac4545d113d30b21dbe45"} Mar 18 10:26:44 crc kubenswrapper[4733]: I0318 10:26:44.328686 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" event={"ID":"29460af7-7801-4268-aae8-f84763762e2f","Type":"ContainerStarted","Data":"c5f05e3d43f450e1955fbac937f68f35f2457a07e96f55f2df71849cb74083ef"} Mar 18 10:26:44 crc kubenswrapper[4733]: I0318 10:26:44.357749 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-5dd4b96b5d-zqmlh" podStartSLOduration=1.357724759 podStartE2EDuration="1.357724759s" podCreationTimestamp="2026-03-18 10:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:26:44.353334234 +0000 UTC m=+843.845068569" watchObservedRunningTime="2026-03-18 10:26:44.357724759 +0000 UTC m=+843.849459094" Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.350902 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" event={"ID":"95b678ac-c7be-4c57-8663-05b207f43338","Type":"ContainerStarted","Data":"0812f89430e5b52e2e1cb4addedf266bf709ff4039eff4e0ac0d69576cb60412"} Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.354284 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.354354 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" event={"ID":"29460af7-7801-4268-aae8-f84763762e2f","Type":"ContainerStarted","Data":"ca9eceb62c3c34b30ce951bab0cc3fdbaa6d0e81c4b060a2021b2b230e232828"} Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.358225 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8jncr" event={"ID":"4c5d76ae-c917-4ba7-91d7-332a8e578245","Type":"ContainerStarted","Data":"e79687014784c6df11db62f5830bad52f1f1bd810a7c455066c5b4c4bee29b5f"} Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.359113 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.361228 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" 
event={"ID":"eb2e5225-c943-4b06-b2de-90ab1168242b","Type":"ContainerStarted","Data":"16abd110e2c3a2397e31e2fbe828abd78cd267f5a803a7d2cdc44df903a8eb78"} Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.380803 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-btpf9" podStartSLOduration=2.03521635 podStartE2EDuration="4.374950288s" podCreationTimestamp="2026-03-18 10:26:43 +0000 UTC" firstStartedPulling="2026-03-18 10:26:44.226153761 +0000 UTC m=+843.717888096" lastFinishedPulling="2026-03-18 10:26:46.565887709 +0000 UTC m=+846.057622034" observedRunningTime="2026-03-18 10:26:47.37185502 +0000 UTC m=+846.863589385" watchObservedRunningTime="2026-03-18 10:26:47.374950288 +0000 UTC m=+846.866684653" Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.397810 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8jncr" podStartSLOduration=1.349079352 podStartE2EDuration="4.397790601s" podCreationTimestamp="2026-03-18 10:26:43 +0000 UTC" firstStartedPulling="2026-03-18 10:26:43.436885477 +0000 UTC m=+842.928619802" lastFinishedPulling="2026-03-18 10:26:46.485596726 +0000 UTC m=+845.977331051" observedRunningTime="2026-03-18 10:26:47.391776599 +0000 UTC m=+846.883510934" watchObservedRunningTime="2026-03-18 10:26:47.397790601 +0000 UTC m=+846.889524926" Mar 18 10:26:47 crc kubenswrapper[4733]: I0318 10:26:47.413232 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx" podStartSLOduration=2.787611978 podStartE2EDuration="5.41317308s" podCreationTimestamp="2026-03-18 10:26:42 +0000 UTC" firstStartedPulling="2026-03-18 10:26:43.862521845 +0000 UTC m=+843.354256160" lastFinishedPulling="2026-03-18 10:26:46.488082937 +0000 UTC m=+845.979817262" observedRunningTime="2026-03-18 10:26:47.410099312 +0000 UTC m=+846.901833657" watchObservedRunningTime="2026-03-18 
10:26:47.41317308 +0000 UTC m=+846.904907445" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.431532 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qpfbg"] Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.432981 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.439219 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpfbg"] Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.628698 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn6vm\" (UniqueName: \"kubernetes.io/projected/e752ae43-529b-407b-8346-a9eb89990c1f-kube-api-access-wn6vm\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.628751 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-catalog-content\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.628840 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-utilities\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.730468 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-utilities\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.730828 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn6vm\" (UniqueName: \"kubernetes.io/projected/e752ae43-529b-407b-8346-a9eb89990c1f-kube-api-access-wn6vm\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.730855 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-catalog-content\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.730995 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-utilities\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.731306 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-catalog-content\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.753678 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn6vm\" (UniqueName: 
\"kubernetes.io/projected/e752ae43-529b-407b-8346-a9eb89990c1f-kube-api-access-wn6vm\") pod \"community-operators-qpfbg\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") " pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:48 crc kubenswrapper[4733]: I0318 10:26:48.781674 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:49 crc kubenswrapper[4733]: I0318 10:26:49.375046 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" event={"ID":"eb2e5225-c943-4b06-b2de-90ab1168242b","Type":"ContainerStarted","Data":"42590cc87de02e5705810a3bbc50a741275a61b4ae89fb7d3122b0303f2aea96"} Mar 18 10:26:49 crc kubenswrapper[4733]: I0318 10:26:49.401301 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-7swn6" podStartSLOduration=2.059048798 podStartE2EDuration="7.401270775s" podCreationTimestamp="2026-03-18 10:26:42 +0000 UTC" firstStartedPulling="2026-03-18 10:26:43.818569399 +0000 UTC m=+843.310303744" lastFinishedPulling="2026-03-18 10:26:49.160791386 +0000 UTC m=+848.652525721" observedRunningTime="2026-03-18 10:26:49.38990686 +0000 UTC m=+848.881641225" watchObservedRunningTime="2026-03-18 10:26:49.401270775 +0000 UTC m=+848.893005150" Mar 18 10:26:49 crc kubenswrapper[4733]: I0318 10:26:49.579598 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpfbg"] Mar 18 10:26:50 crc kubenswrapper[4733]: I0318 10:26:50.384051 4733 generic.go:334] "Generic (PLEG): container finished" podID="e752ae43-529b-407b-8346-a9eb89990c1f" containerID="c39a8c0e7870cb3625474393180c8aa4297d3c1427acc66e1c577dd7840a186c" exitCode=0 Mar 18 10:26:50 crc kubenswrapper[4733]: I0318 10:26:50.384126 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpfbg" 
event={"ID":"e752ae43-529b-407b-8346-a9eb89990c1f","Type":"ContainerDied","Data":"c39a8c0e7870cb3625474393180c8aa4297d3c1427acc66e1c577dd7840a186c"} Mar 18 10:26:50 crc kubenswrapper[4733]: I0318 10:26:50.384695 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpfbg" event={"ID":"e752ae43-529b-407b-8346-a9eb89990c1f","Type":"ContainerStarted","Data":"a969dc374c564078153562a05b2d6e6f30e62624c3adff3d6bf3cbebc8ffe206"} Mar 18 10:26:51 crc kubenswrapper[4733]: I0318 10:26:51.395544 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpfbg" event={"ID":"e752ae43-529b-407b-8346-a9eb89990c1f","Type":"ContainerStarted","Data":"d84796ff2be656e89a749a5e50910da1185cb7b35004070858a0b58ba51bc5ea"} Mar 18 10:26:52 crc kubenswrapper[4733]: I0318 10:26:52.407867 4733 generic.go:334] "Generic (PLEG): container finished" podID="e752ae43-529b-407b-8346-a9eb89990c1f" containerID="d84796ff2be656e89a749a5e50910da1185cb7b35004070858a0b58ba51bc5ea" exitCode=0 Mar 18 10:26:52 crc kubenswrapper[4733]: I0318 10:26:52.407934 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpfbg" event={"ID":"e752ae43-529b-407b-8346-a9eb89990c1f","Type":"ContainerDied","Data":"d84796ff2be656e89a749a5e50910da1185cb7b35004070858a0b58ba51bc5ea"} Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.425298 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpfbg" event={"ID":"e752ae43-529b-407b-8346-a9eb89990c1f","Type":"ContainerStarted","Data":"8a6db4a96078cce01cf2e13028e9a6e04a621dfaaea4705c4e07797befc3b96f"} Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.436093 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8jncr" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.445841 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5rwz6"] Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.452956 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.454646 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwz6"] Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.470492 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qpfbg" podStartSLOduration=3.014091372 podStartE2EDuration="5.470464641s" podCreationTimestamp="2026-03-18 10:26:48 +0000 UTC" firstStartedPulling="2026-03-18 10:26:50.386216138 +0000 UTC m=+849.877950503" lastFinishedPulling="2026-03-18 10:26:52.842589427 +0000 UTC m=+852.334323772" observedRunningTime="2026-03-18 10:26:53.462854354 +0000 UTC m=+852.954588719" watchObservedRunningTime="2026-03-18 10:26:53.470464641 +0000 UTC m=+852.962199006" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.606476 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-utilities\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.606543 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgw6d\" (UniqueName: \"kubernetes.io/projected/c2eed320-8685-4d5c-8793-e506593a33ca-kube-api-access-qgw6d\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.606649 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-catalog-content\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.707696 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-catalog-content\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.707927 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-utilities\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.708002 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgw6d\" (UniqueName: \"kubernetes.io/projected/c2eed320-8685-4d5c-8793-e506593a33ca-kube-api-access-qgw6d\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.708214 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.708275 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.708333 4733 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-catalog-content\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.708703 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-utilities\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.716745 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.734557 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgw6d\" (UniqueName: \"kubernetes.io/projected/c2eed320-8685-4d5c-8793-e506593a33ca-kube-api-access-qgw6d\") pod \"redhat-marketplace-5rwz6\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") " pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:53 crc kubenswrapper[4733]: I0318 10:26:53.777197 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rwz6" Mar 18 10:26:54 crc kubenswrapper[4733]: I0318 10:26:54.028526 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwz6"] Mar 18 10:26:54 crc kubenswrapper[4733]: I0318 10:26:54.435321 4733 generic.go:334] "Generic (PLEG): container finished" podID="c2eed320-8685-4d5c-8793-e506593a33ca" containerID="e404d64c5fe9561da0e879dd49ffac0a191569c02ab91f0ac47c23b549c8a732" exitCode=0 Mar 18 10:26:54 crc kubenswrapper[4733]: I0318 10:26:54.435382 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwz6" event={"ID":"c2eed320-8685-4d5c-8793-e506593a33ca","Type":"ContainerDied","Data":"e404d64c5fe9561da0e879dd49ffac0a191569c02ab91f0ac47c23b549c8a732"} Mar 18 10:26:54 crc kubenswrapper[4733]: I0318 10:26:54.435847 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwz6" event={"ID":"c2eed320-8685-4d5c-8793-e506593a33ca","Type":"ContainerStarted","Data":"a61df60801b4e77be52dfd9c59138a93d02124c8a4487a0d508ef12a254c3910"} Mar 18 10:26:54 crc kubenswrapper[4733]: I0318 10:26:54.444058 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dd4b96b5d-zqmlh" Mar 18 10:26:54 crc kubenswrapper[4733]: I0318 10:26:54.521228 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8v244"] Mar 18 10:26:54 crc kubenswrapper[4733]: I0318 10:26:54.697682 4733 scope.go:117] "RemoveContainer" containerID="4d1f85ec68f66c1e8dcc6134fd20cc9907c6036a83ddad6341fd815f0c10f145" Mar 18 10:26:55 crc kubenswrapper[4733]: I0318 10:26:55.444600 4733 generic.go:334] "Generic (PLEG): container finished" podID="c2eed320-8685-4d5c-8793-e506593a33ca" containerID="c6107d273465cbf0ca33eea0159f3ea649eae43a1b11f06f2154604037aa32c1" exitCode=0 Mar 18 10:26:55 crc kubenswrapper[4733]: I0318 10:26:55.444804 
4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwz6" event={"ID":"c2eed320-8685-4d5c-8793-e506593a33ca","Type":"ContainerDied","Data":"c6107d273465cbf0ca33eea0159f3ea649eae43a1b11f06f2154604037aa32c1"} Mar 18 10:26:56 crc kubenswrapper[4733]: I0318 10:26:56.460294 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwz6" event={"ID":"c2eed320-8685-4d5c-8793-e506593a33ca","Type":"ContainerStarted","Data":"c6d8dc1c0b97ce23301426f81a422e6e5a4b20b07fb86d1870204fc64f69c8a8"} Mar 18 10:26:56 crc kubenswrapper[4733]: I0318 10:26:56.491729 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5rwz6" podStartSLOduration=2.074472405 podStartE2EDuration="3.491699415s" podCreationTimestamp="2026-03-18 10:26:53 +0000 UTC" firstStartedPulling="2026-03-18 10:26:54.437199714 +0000 UTC m=+853.928934029" lastFinishedPulling="2026-03-18 10:26:55.854426684 +0000 UTC m=+855.346161039" observedRunningTime="2026-03-18 10:26:56.488086062 +0000 UTC m=+855.979820397" watchObservedRunningTime="2026-03-18 10:26:56.491699415 +0000 UTC m=+855.983433730" Mar 18 10:26:58 crc kubenswrapper[4733]: I0318 10:26:58.785603 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:58 crc kubenswrapper[4733]: I0318 10:26:58.786014 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:58 crc kubenswrapper[4733]: I0318 10:26:58.862871 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:26:59 crc kubenswrapper[4733]: I0318 10:26:59.555720 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qpfbg" Mar 18 10:27:02 crc 
kubenswrapper[4733]: I0318 10:27:02.219372    4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpfbg"]
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.220444    4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qpfbg" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" containerName="registry-server" containerID="cri-o://8a6db4a96078cce01cf2e13028e9a6e04a621dfaaea4705c4e07797befc3b96f" gracePeriod=2
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.513306    4733 generic.go:334] "Generic (PLEG): container finished" podID="e752ae43-529b-407b-8346-a9eb89990c1f" containerID="8a6db4a96078cce01cf2e13028e9a6e04a621dfaaea4705c4e07797befc3b96f" exitCode=0
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.513418    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpfbg" event={"ID":"e752ae43-529b-407b-8346-a9eb89990c1f","Type":"ContainerDied","Data":"8a6db4a96078cce01cf2e13028e9a6e04a621dfaaea4705c4e07797befc3b96f"}
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.674986    4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpfbg"
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.745123    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-utilities\") pod \"e752ae43-529b-407b-8346-a9eb89990c1f\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") "
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.745262    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-catalog-content\") pod \"e752ae43-529b-407b-8346-a9eb89990c1f\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") "
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.745358    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn6vm\" (UniqueName: \"kubernetes.io/projected/e752ae43-529b-407b-8346-a9eb89990c1f-kube-api-access-wn6vm\") pod \"e752ae43-529b-407b-8346-a9eb89990c1f\" (UID: \"e752ae43-529b-407b-8346-a9eb89990c1f\") "
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.746594    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-utilities" (OuterVolumeSpecName: "utilities") pod "e752ae43-529b-407b-8346-a9eb89990c1f" (UID: "e752ae43-529b-407b-8346-a9eb89990c1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.751785    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e752ae43-529b-407b-8346-a9eb89990c1f-kube-api-access-wn6vm" (OuterVolumeSpecName: "kube-api-access-wn6vm") pod "e752ae43-529b-407b-8346-a9eb89990c1f" (UID: "e752ae43-529b-407b-8346-a9eb89990c1f"). InnerVolumeSpecName "kube-api-access-wn6vm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.812450    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e752ae43-529b-407b-8346-a9eb89990c1f" (UID: "e752ae43-529b-407b-8346-a9eb89990c1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.847364    4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.847407    4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn6vm\" (UniqueName: \"kubernetes.io/projected/e752ae43-529b-407b-8346-a9eb89990c1f-kube-api-access-wn6vm\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:02 crc kubenswrapper[4733]: I0318 10:27:02.847421    4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e752ae43-529b-407b-8346-a9eb89990c1f-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.391038    4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m6rhx"
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.526057    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpfbg" event={"ID":"e752ae43-529b-407b-8346-a9eb89990c1f","Type":"ContainerDied","Data":"a969dc374c564078153562a05b2d6e6f30e62624c3adff3d6bf3cbebc8ffe206"}
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.526817    4733 scope.go:117] "RemoveContainer" containerID="8a6db4a96078cce01cf2e13028e9a6e04a621dfaaea4705c4e07797befc3b96f"
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.526144    4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpfbg"
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.560517    4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpfbg"]
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.563928    4733 scope.go:117] "RemoveContainer" containerID="d84796ff2be656e89a749a5e50910da1185cb7b35004070858a0b58ba51bc5ea"
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.564330    4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qpfbg"]
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.590041    4733 scope.go:117] "RemoveContainer" containerID="c39a8c0e7870cb3625474393180c8aa4297d3c1427acc66e1c577dd7840a186c"
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.777787    4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5rwz6"
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.778249    4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5rwz6"
Mar 18 10:27:03 crc kubenswrapper[4733]: I0318 10:27:03.846374    4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5rwz6"
Mar 18 10:27:04 crc kubenswrapper[4733]: I0318 10:27:04.616323    4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5rwz6"
Mar 18 10:27:05 crc kubenswrapper[4733]: I0318 10:27:05.191085    4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" path="/var/lib/kubelet/pods/e752ae43-529b-407b-8346-a9eb89990c1f/volumes"
Mar 18 10:27:07 crc kubenswrapper[4733]: I0318 10:27:07.418748    4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwz6"]
Mar 18 10:27:07 crc kubenswrapper[4733]: I0318 10:27:07.419141    4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5rwz6" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" containerName="registry-server" containerID="cri-o://c6d8dc1c0b97ce23301426f81a422e6e5a4b20b07fb86d1870204fc64f69c8a8" gracePeriod=2
Mar 18 10:27:07 crc kubenswrapper[4733]: I0318 10:27:07.568012    4733 generic.go:334] "Generic (PLEG): container finished" podID="c2eed320-8685-4d5c-8793-e506593a33ca" containerID="c6d8dc1c0b97ce23301426f81a422e6e5a4b20b07fb86d1870204fc64f69c8a8" exitCode=0
Mar 18 10:27:07 crc kubenswrapper[4733]: I0318 10:27:07.568085    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwz6" event={"ID":"c2eed320-8685-4d5c-8793-e506593a33ca","Type":"ContainerDied","Data":"c6d8dc1c0b97ce23301426f81a422e6e5a4b20b07fb86d1870204fc64f69c8a8"}
Mar 18 10:27:07 crc kubenswrapper[4733]: I0318 10:27:07.878504    4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rwz6"
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.029831    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgw6d\" (UniqueName: \"kubernetes.io/projected/c2eed320-8685-4d5c-8793-e506593a33ca-kube-api-access-qgw6d\") pod \"c2eed320-8685-4d5c-8793-e506593a33ca\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") "
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.029925    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-catalog-content\") pod \"c2eed320-8685-4d5c-8793-e506593a33ca\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") "
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.029960    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-utilities\") pod \"c2eed320-8685-4d5c-8793-e506593a33ca\" (UID: \"c2eed320-8685-4d5c-8793-e506593a33ca\") "
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.031406    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-utilities" (OuterVolumeSpecName: "utilities") pod "c2eed320-8685-4d5c-8793-e506593a33ca" (UID: "c2eed320-8685-4d5c-8793-e506593a33ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.045458    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2eed320-8685-4d5c-8793-e506593a33ca-kube-api-access-qgw6d" (OuterVolumeSpecName: "kube-api-access-qgw6d") pod "c2eed320-8685-4d5c-8793-e506593a33ca" (UID: "c2eed320-8685-4d5c-8793-e506593a33ca"). InnerVolumeSpecName "kube-api-access-qgw6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.077177    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2eed320-8685-4d5c-8793-e506593a33ca" (UID: "c2eed320-8685-4d5c-8793-e506593a33ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.132365    4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgw6d\" (UniqueName: \"kubernetes.io/projected/c2eed320-8685-4d5c-8793-e506593a33ca-kube-api-access-qgw6d\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.132400    4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.132410    4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2eed320-8685-4d5c-8793-e506593a33ca-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.579358    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwz6" event={"ID":"c2eed320-8685-4d5c-8793-e506593a33ca","Type":"ContainerDied","Data":"a61df60801b4e77be52dfd9c59138a93d02124c8a4487a0d508ef12a254c3910"}
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.579517    4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rwz6"
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.579815    4733 scope.go:117] "RemoveContainer" containerID="c6d8dc1c0b97ce23301426f81a422e6e5a4b20b07fb86d1870204fc64f69c8a8"
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.611843    4733 scope.go:117] "RemoveContainer" containerID="c6107d273465cbf0ca33eea0159f3ea649eae43a1b11f06f2154604037aa32c1"
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.641497    4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwz6"]
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.643144    4733 scope.go:117] "RemoveContainer" containerID="e404d64c5fe9561da0e879dd49ffac0a191569c02ab91f0ac47c23b549c8a732"
Mar 18 10:27:08 crc kubenswrapper[4733]: I0318 10:27:08.650771    4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwz6"]
Mar 18 10:27:09 crc kubenswrapper[4733]: I0318 10:27:09.187947    4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" path="/var/lib/kubelet/pods/c2eed320-8685-4d5c-8793-e506593a33ca/volumes"
Mar 18 10:27:13 crc kubenswrapper[4733]: I0318 10:27:13.571519    4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:27:13 crc kubenswrapper[4733]: I0318 10:27:13.572286    4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:27:13 crc kubenswrapper[4733]: I0318 10:27:13.572343    4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp"
Mar 18 10:27:13 crc kubenswrapper[4733]: I0318 10:27:13.573089    4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a11e956cdd33846b5919c35822db029436f82987d5e2c2bb6427c6d1dfd2098c"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 10:27:13 crc kubenswrapper[4733]: I0318 10:27:13.573165    4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://a11e956cdd33846b5919c35822db029436f82987d5e2c2bb6427c6d1dfd2098c" gracePeriod=600
Mar 18 10:27:14 crc kubenswrapper[4733]: I0318 10:27:14.627629    4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="a11e956cdd33846b5919c35822db029436f82987d5e2c2bb6427c6d1dfd2098c" exitCode=0
Mar 18 10:27:14 crc kubenswrapper[4733]: I0318 10:27:14.627722    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"a11e956cdd33846b5919c35822db029436f82987d5e2c2bb6427c6d1dfd2098c"}
Mar 18 10:27:14 crc kubenswrapper[4733]: I0318 10:27:14.628275    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"2a78644e078fbb319d0fc66d47cfb2501076e4fd678ad793e791ddb4f3d3ee96"}
Mar 18 10:27:14 crc kubenswrapper[4733]: I0318 10:27:14.628297    4733 scope.go:117] "RemoveContainer" containerID="bff727181393f1168072f98fbfc5cda5acfb0782a9ae8a688a8335ed7323a527"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.076552    4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"]
Mar 18 10:27:18 crc kubenswrapper[4733]: E0318 10:27:18.079488    4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" containerName="registry-server"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.079508    4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" containerName="registry-server"
Mar 18 10:27:18 crc kubenswrapper[4733]: E0318 10:27:18.079523    4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" containerName="extract-content"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.079531    4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" containerName="extract-content"
Mar 18 10:27:18 crc kubenswrapper[4733]: E0318 10:27:18.079543    4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" containerName="extract-utilities"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.079551    4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" containerName="extract-utilities"
Mar 18 10:27:18 crc kubenswrapper[4733]: E0318 10:27:18.079564    4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" containerName="extract-content"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.079572    4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" containerName="extract-content"
Mar 18 10:27:18 crc kubenswrapper[4733]: E0318 10:27:18.079585    4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" containerName="registry-server"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.079592    4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" containerName="registry-server"
Mar 18 10:27:18 crc kubenswrapper[4733]: E0318 10:27:18.079600    4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" containerName="extract-utilities"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.079608    4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" containerName="extract-utilities"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.079728    4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2eed320-8685-4d5c-8793-e506593a33ca" containerName="registry-server"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.079742    4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e752ae43-529b-407b-8346-a9eb89990c1f" containerName="registry-server"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.080802    4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.089687    4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.099980    4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"]
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.205545    4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.205887    4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.206133    4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrws\" (UniqueName: \"kubernetes.io/projected/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-kube-api-access-cnrws\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.308366    4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrws\" (UniqueName: \"kubernetes.io/projected/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-kube-api-access-cnrws\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.308520    4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.308669    4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.309645    4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.309817    4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.342935    4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrws\" (UniqueName: \"kubernetes.io/projected/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-kube-api-access-cnrws\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.399522    4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"
Mar 18 10:27:18 crc kubenswrapper[4733]: I0318 10:27:18.758267    4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb"]
Mar 18 10:27:19 crc kubenswrapper[4733]: I0318 10:27:19.560594    4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8v244" podUID="f27409fc-b6dd-4573-918b-7b30b3635cc7" containerName="console" containerID="cri-o://bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1" gracePeriod=15
Mar 18 10:27:19 crc kubenswrapper[4733]: I0318 10:27:19.677540    4733 generic.go:334] "Generic (PLEG): container finished" podID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerID="45f878e96607b8811e5b3070c9def994347e7a1c80f639b1670a63f141ea7cc8" exitCode=0
Mar 18 10:27:19 crc kubenswrapper[4733]: I0318 10:27:19.677589    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb" event={"ID":"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6","Type":"ContainerDied","Data":"45f878e96607b8811e5b3070c9def994347e7a1c80f639b1670a63f141ea7cc8"}
Mar 18 10:27:19 crc kubenswrapper[4733]: I0318 10:27:19.677618    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb" event={"ID":"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6","Type":"ContainerStarted","Data":"1be03341496d64c1aaa71d45ded69293eb224d4a64bdfa61810547dec07a2582"}
Mar 18 10:27:19 crc kubenswrapper[4733]: I0318 10:27:19.685070    4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.066807    4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8v244_f27409fc-b6dd-4573-918b-7b30b3635cc7/console/0.log"
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.067350    4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8v244"
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.249016    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdltm\" (UniqueName: \"kubernetes.io/projected/f27409fc-b6dd-4573-918b-7b30b3635cc7-kube-api-access-cdltm\") pod \"f27409fc-b6dd-4573-918b-7b30b3635cc7\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") "
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.249121    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-config\") pod \"f27409fc-b6dd-4573-918b-7b30b3635cc7\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") "
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.249165    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-service-ca\") pod \"f27409fc-b6dd-4573-918b-7b30b3635cc7\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") "
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.249275    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-oauth-serving-cert\") pod \"f27409fc-b6dd-4573-918b-7b30b3635cc7\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") "
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.250270    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-config" (OuterVolumeSpecName: "console-config") pod "f27409fc-b6dd-4573-918b-7b30b3635cc7" (UID: "f27409fc-b6dd-4573-918b-7b30b3635cc7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.250335    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-service-ca" (OuterVolumeSpecName: "service-ca") pod "f27409fc-b6dd-4573-918b-7b30b3635cc7" (UID: "f27409fc-b6dd-4573-918b-7b30b3635cc7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.250484    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-oauth-config\") pod \"f27409fc-b6dd-4573-918b-7b30b3635cc7\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") "
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.251320    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f27409fc-b6dd-4573-918b-7b30b3635cc7" (UID: "f27409fc-b6dd-4573-918b-7b30b3635cc7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.251872    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-serving-cert\") pod \"f27409fc-b6dd-4573-918b-7b30b3635cc7\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") "
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.251950    4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-trusted-ca-bundle\") pod \"f27409fc-b6dd-4573-918b-7b30b3635cc7\" (UID: \"f27409fc-b6dd-4573-918b-7b30b3635cc7\") "
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.252510    4733 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-config\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.252538    4733 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-service-ca\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.252558    4733 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.252538    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f27409fc-b6dd-4573-918b-7b30b3635cc7" (UID: "f27409fc-b6dd-4573-918b-7b30b3635cc7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.259529    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f27409fc-b6dd-4573-918b-7b30b3635cc7" (UID: "f27409fc-b6dd-4573-918b-7b30b3635cc7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.260154    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f27409fc-b6dd-4573-918b-7b30b3635cc7" (UID: "f27409fc-b6dd-4573-918b-7b30b3635cc7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.262022    4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27409fc-b6dd-4573-918b-7b30b3635cc7-kube-api-access-cdltm" (OuterVolumeSpecName: "kube-api-access-cdltm") pod "f27409fc-b6dd-4573-918b-7b30b3635cc7" (UID: "f27409fc-b6dd-4573-918b-7b30b3635cc7"). InnerVolumeSpecName "kube-api-access-cdltm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.353968    4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdltm\" (UniqueName: \"kubernetes.io/projected/f27409fc-b6dd-4573-918b-7b30b3635cc7-kube-api-access-cdltm\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.354031    4733 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.354051    4733 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f27409fc-b6dd-4573-918b-7b30b3635cc7-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.354070    4733 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27409fc-b6dd-4573-918b-7b30b3635cc7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.688441    4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8v244_f27409fc-b6dd-4573-918b-7b30b3635cc7/console/0.log"
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.688519    4733 generic.go:334] "Generic (PLEG): container finished" podID="f27409fc-b6dd-4573-918b-7b30b3635cc7" containerID="bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1" exitCode=2
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.688563    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8v244" event={"ID":"f27409fc-b6dd-4573-918b-7b30b3635cc7","Type":"ContainerDied","Data":"bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1"}
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.688606    4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8v244"
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.688634    4733 scope.go:117] "RemoveContainer" containerID="bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1"
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.688616    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8v244" event={"ID":"f27409fc-b6dd-4573-918b-7b30b3635cc7","Type":"ContainerDied","Data":"a5e5da5d6249a1112447a42843768f7217f63fd427eb58063240eac26ad5daee"}
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.715359    4733 scope.go:117] "RemoveContainer" containerID="bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1"
Mar 18 10:27:20 crc kubenswrapper[4733]: E0318 10:27:20.716162    4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1\": container with ID starting with bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1 not found: ID does not exist" containerID="bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1"
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.716248    4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1"} err="failed to get container status \"bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1\": rpc error: code = NotFound desc = could not find container \"bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1\": container with ID starting with bd6ef4d994ae506be5343c7bd62e3c9d5c8d51a521ee2a66c4d08bede745d9e1 not found: ID does not exist"
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.736951    4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8v244"]
Mar 18 10:27:20 crc kubenswrapper[4733]: I0318 10:27:20.743973    4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8v244"]
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.189076    4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27409fc-b6dd-4573-918b-7b30b3635cc7" path="/var/lib/kubelet/pods/f27409fc-b6dd-4573-918b-7b30b3635cc7/volumes"
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.644677    4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zcphv"]
Mar 18 10:27:21 crc kubenswrapper[4733]: E0318 10:27:21.645611    4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27409fc-b6dd-4573-918b-7b30b3635cc7" containerName="console"
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.645638    4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27409fc-b6dd-4573-918b-7b30b3635cc7" containerName="console"
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.646044    4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27409fc-b6dd-4573-918b-7b30b3635cc7" containerName="console"
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.648249    4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcphv"
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.672983    4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zcphv"]
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.701393    4733 generic.go:334] "Generic (PLEG): container finished" podID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerID="c4d67150099953817c49555f32364536b6d9fd3cb9344b78203a23df29609609" exitCode=0
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.701455    4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb" event={"ID":"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6","Type":"ContainerDied","Data":"c4d67150099953817c49555f32364536b6d9fd3cb9344b78203a23df29609609"}
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.774927    4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-utilities\") pod \"redhat-operators-zcphv\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " pod="openshift-marketplace/redhat-operators-zcphv"
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.775038    4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-catalog-content\") pod \"redhat-operators-zcphv\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " pod="openshift-marketplace/redhat-operators-zcphv"
Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.775255    4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2q6\" (UniqueName: \"kubernetes.io/projected/a94fa194-a338-4d4d-9b7a-12440afb4e22-kube-api-access-zv2q6\") pod \"redhat-operators-zcphv\" (UID:
\"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.876668 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2q6\" (UniqueName: \"kubernetes.io/projected/a94fa194-a338-4d4d-9b7a-12440afb4e22-kube-api-access-zv2q6\") pod \"redhat-operators-zcphv\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.876749 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-utilities\") pod \"redhat-operators-zcphv\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.876827 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-catalog-content\") pod \"redhat-operators-zcphv\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.877445 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-utilities\") pod \"redhat-operators-zcphv\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.877520 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-catalog-content\") pod \"redhat-operators-zcphv\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " 
pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.898716 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2q6\" (UniqueName: \"kubernetes.io/projected/a94fa194-a338-4d4d-9b7a-12440afb4e22-kube-api-access-zv2q6\") pod \"redhat-operators-zcphv\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:21 crc kubenswrapper[4733]: I0318 10:27:21.984784 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:22 crc kubenswrapper[4733]: I0318 10:27:22.450509 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zcphv"] Mar 18 10:27:22 crc kubenswrapper[4733]: I0318 10:27:22.714127 4733 generic.go:334] "Generic (PLEG): container finished" podID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerID="f480edb077b6a47bec5706c08fdabad3731b30c4200baa35b38bed5c3ca5beac" exitCode=0 Mar 18 10:27:22 crc kubenswrapper[4733]: I0318 10:27:22.714241 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb" event={"ID":"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6","Type":"ContainerDied","Data":"f480edb077b6a47bec5706c08fdabad3731b30c4200baa35b38bed5c3ca5beac"} Mar 18 10:27:22 crc kubenswrapper[4733]: I0318 10:27:22.716992 4733 generic.go:334] "Generic (PLEG): container finished" podID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerID="f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05" exitCode=0 Mar 18 10:27:22 crc kubenswrapper[4733]: I0318 10:27:22.717057 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcphv" event={"ID":"a94fa194-a338-4d4d-9b7a-12440afb4e22","Type":"ContainerDied","Data":"f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05"} Mar 
18 10:27:22 crc kubenswrapper[4733]: I0318 10:27:22.717094 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcphv" event={"ID":"a94fa194-a338-4d4d-9b7a-12440afb4e22","Type":"ContainerStarted","Data":"1acf1672f2e3aed16d46401b9ad4350dbc89e72639a53dcefec1594e4af7cc4b"} Mar 18 10:27:23 crc kubenswrapper[4733]: I0318 10:27:23.726201 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcphv" event={"ID":"a94fa194-a338-4d4d-9b7a-12440afb4e22","Type":"ContainerStarted","Data":"8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214"} Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.046567 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb" Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.210970 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-bundle\") pod \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.211734 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-util\") pod \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.211783 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnrws\" (UniqueName: \"kubernetes.io/projected/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-kube-api-access-cnrws\") pod \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\" (UID: \"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6\") " Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.213156 4733 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-bundle" (OuterVolumeSpecName: "bundle") pod "2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" (UID: "2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.220282 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-kube-api-access-cnrws" (OuterVolumeSpecName: "kube-api-access-cnrws") pod "2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" (UID: "2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6"). InnerVolumeSpecName "kube-api-access-cnrws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.314162 4733 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.314239 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnrws\" (UniqueName: \"kubernetes.io/projected/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-kube-api-access-cnrws\") on node \"crc\" DevicePath \"\"" Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.475463 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-util" (OuterVolumeSpecName: "util") pod "2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" (UID: "2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.517558 4733 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6-util\") on node \"crc\" DevicePath \"\"" Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.737976 4733 generic.go:334] "Generic (PLEG): container finished" podID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerID="8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214" exitCode=0 Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.738161 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcphv" event={"ID":"a94fa194-a338-4d4d-9b7a-12440afb4e22","Type":"ContainerDied","Data":"8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214"} Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.742535 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb" event={"ID":"2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6","Type":"ContainerDied","Data":"1be03341496d64c1aaa71d45ded69293eb224d4a64bdfa61810547dec07a2582"} Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.742608 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1be03341496d64c1aaa71d45ded69293eb224d4a64bdfa61810547dec07a2582" Mar 18 10:27:24 crc kubenswrapper[4733]: I0318 10:27:24.742756 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb" Mar 18 10:27:25 crc kubenswrapper[4733]: I0318 10:27:25.755550 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcphv" event={"ID":"a94fa194-a338-4d4d-9b7a-12440afb4e22","Type":"ContainerStarted","Data":"d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8"} Mar 18 10:27:25 crc kubenswrapper[4733]: I0318 10:27:25.790982 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zcphv" podStartSLOduration=2.317880858 podStartE2EDuration="4.790955855s" podCreationTimestamp="2026-03-18 10:27:21 +0000 UTC" firstStartedPulling="2026-03-18 10:27:22.71855391 +0000 UTC m=+882.210288235" lastFinishedPulling="2026-03-18 10:27:25.191628867 +0000 UTC m=+884.683363232" observedRunningTime="2026-03-18 10:27:25.782787662 +0000 UTC m=+885.274522047" watchObservedRunningTime="2026-03-18 10:27:25.790955855 +0000 UTC m=+885.282690210" Mar 18 10:27:31 crc kubenswrapper[4733]: I0318 10:27:31.985033 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:31 crc kubenswrapper[4733]: I0318 10:27:31.986822 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:33 crc kubenswrapper[4733]: I0318 10:27:33.040993 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zcphv" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="registry-server" probeResult="failure" output=< Mar 18 10:27:33 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Mar 18 10:27:33 crc kubenswrapper[4733]: > Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.633864 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z"] Mar 18 10:27:36 crc kubenswrapper[4733]: E0318 10:27:36.634390 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerName="pull" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.634404 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerName="pull" Mar 18 10:27:36 crc kubenswrapper[4733]: E0318 10:27:36.634416 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerName="util" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.634422 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerName="util" Mar 18 10:27:36 crc kubenswrapper[4733]: E0318 10:27:36.634436 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerName="extract" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.634443 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerName="extract" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.634537 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6" containerName="extract" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.634889 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.637034 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.637456 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.637515 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ln5pm" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.637565 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.637588 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.668342 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z"] Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.699958 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9731a250-9d78-43e0-bde3-7e769ea43d11-apiservice-cert\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: \"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.700039 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9731a250-9d78-43e0-bde3-7e769ea43d11-webhook-cert\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: 
\"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.700143 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frt5k\" (UniqueName: \"kubernetes.io/projected/9731a250-9d78-43e0-bde3-7e769ea43d11-kube-api-access-frt5k\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: \"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.800872 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frt5k\" (UniqueName: \"kubernetes.io/projected/9731a250-9d78-43e0-bde3-7e769ea43d11-kube-api-access-frt5k\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: \"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.800927 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9731a250-9d78-43e0-bde3-7e769ea43d11-apiservice-cert\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: \"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.800961 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9731a250-9d78-43e0-bde3-7e769ea43d11-webhook-cert\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: \"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.807951 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9731a250-9d78-43e0-bde3-7e769ea43d11-webhook-cert\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: \"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.807987 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9731a250-9d78-43e0-bde3-7e769ea43d11-apiservice-cert\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: \"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.822916 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frt5k\" (UniqueName: \"kubernetes.io/projected/9731a250-9d78-43e0-bde3-7e769ea43d11-kube-api-access-frt5k\") pod \"metallb-operator-controller-manager-5ddc5ff65-jst9z\" (UID: \"9731a250-9d78-43e0-bde3-7e769ea43d11\") " pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.877935 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9"] Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.878629 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.880553 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.880761 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xkx8r" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.882454 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.899680 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9"] Mar 18 10:27:36 crc kubenswrapper[4733]: I0318 10:27:36.955823 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.002920 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nqfj\" (UniqueName: \"kubernetes.io/projected/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-kube-api-access-7nqfj\") pod \"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.002984 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-apiservice-cert\") pod \"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 
10:27:37.003046 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-webhook-cert\") pod \"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.104092 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nqfj\" (UniqueName: \"kubernetes.io/projected/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-kube-api-access-7nqfj\") pod \"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.104456 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-apiservice-cert\") pod \"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.104491 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-webhook-cert\") pod \"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.108092 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-apiservice-cert\") pod 
\"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.108287 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-webhook-cert\") pod \"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.135645 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nqfj\" (UniqueName: \"kubernetes.io/projected/37ecdf54-7bcf-4d33-9cd9-f156974ea7f9-kube-api-access-7nqfj\") pod \"metallb-operator-webhook-server-c99d9f4d6-n5lc9\" (UID: \"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9\") " pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.188830 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z"] Mar 18 10:27:37 crc kubenswrapper[4733]: W0318 10:27:37.192527 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9731a250_9d78_43e0_bde3_7e769ea43d11.slice/crio-93342bba1bffe5603753132a52801bfdc07b439a3f8211da7437bac551bcc01e WatchSource:0}: Error finding container 93342bba1bffe5603753132a52801bfdc07b439a3f8211da7437bac551bcc01e: Status 404 returned error can't find the container with id 93342bba1bffe5603753132a52801bfdc07b439a3f8211da7437bac551bcc01e Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.194624 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.535099 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9"] Mar 18 10:27:37 crc kubenswrapper[4733]: W0318 10:27:37.544852 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37ecdf54_7bcf_4d33_9cd9_f156974ea7f9.slice/crio-7f2887f7ee581e709c23e9fffb49d16eaaf7ff388a25afd577aed07a80b1fe61 WatchSource:0}: Error finding container 7f2887f7ee581e709c23e9fffb49d16eaaf7ff388a25afd577aed07a80b1fe61: Status 404 returned error can't find the container with id 7f2887f7ee581e709c23e9fffb49d16eaaf7ff388a25afd577aed07a80b1fe61 Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.855215 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" event={"ID":"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9","Type":"ContainerStarted","Data":"7f2887f7ee581e709c23e9fffb49d16eaaf7ff388a25afd577aed07a80b1fe61"} Mar 18 10:27:37 crc kubenswrapper[4733]: I0318 10:27:37.862823 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" event={"ID":"9731a250-9d78-43e0-bde3-7e769ea43d11","Type":"ContainerStarted","Data":"93342bba1bffe5603753132a52801bfdc07b439a3f8211da7437bac551bcc01e"} Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.023932 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.075524 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.262522 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-zcphv"] Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.898348 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" event={"ID":"9731a250-9d78-43e0-bde3-7e769ea43d11","Type":"ContainerStarted","Data":"5ef57c73979a2f29c24bd893efac1018c5c8476eb95c9fd28d2fac812ffd78e5"} Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.898967 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.902553 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" event={"ID":"37ecdf54-7bcf-4d33-9cd9-f156974ea7f9","Type":"ContainerStarted","Data":"7dbe890480beca5b6ca7f16045b77235d3e37c02e155e8120b1b7381bfa3e1c0"} Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.902614 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.926636 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" podStartSLOduration=1.8370444620000002 podStartE2EDuration="6.926607843s" podCreationTimestamp="2026-03-18 10:27:36 +0000 UTC" firstStartedPulling="2026-03-18 10:27:37.194743217 +0000 UTC m=+896.686477562" lastFinishedPulling="2026-03-18 10:27:42.284306618 +0000 UTC m=+901.776040943" observedRunningTime="2026-03-18 10:27:42.922888587 +0000 UTC m=+902.414622912" watchObservedRunningTime="2026-03-18 10:27:42.926607843 +0000 UTC m=+902.418342188" Mar 18 10:27:42 crc kubenswrapper[4733]: I0318 10:27:42.961110 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" podStartSLOduration=2.208227574 podStartE2EDuration="6.961091228s" podCreationTimestamp="2026-03-18 10:27:36 +0000 UTC" firstStartedPulling="2026-03-18 10:27:37.547528714 +0000 UTC m=+897.039263029" lastFinishedPulling="2026-03-18 10:27:42.300392358 +0000 UTC m=+901.792126683" observedRunningTime="2026-03-18 10:27:42.957277789 +0000 UTC m=+902.449012124" watchObservedRunningTime="2026-03-18 10:27:42.961091228 +0000 UTC m=+902.452825563" Mar 18 10:27:43 crc kubenswrapper[4733]: I0318 10:27:43.907555 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zcphv" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="registry-server" containerID="cri-o://d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8" gracePeriod=2 Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.305278 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.309504 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-utilities\") pod \"a94fa194-a338-4d4d-9b7a-12440afb4e22\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.309548 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv2q6\" (UniqueName: \"kubernetes.io/projected/a94fa194-a338-4d4d-9b7a-12440afb4e22-kube-api-access-zv2q6\") pod \"a94fa194-a338-4d4d-9b7a-12440afb4e22\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.309570 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-catalog-content\") pod \"a94fa194-a338-4d4d-9b7a-12440afb4e22\" (UID: \"a94fa194-a338-4d4d-9b7a-12440afb4e22\") " Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.310302 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-utilities" (OuterVolumeSpecName: "utilities") pod "a94fa194-a338-4d4d-9b7a-12440afb4e22" (UID: "a94fa194-a338-4d4d-9b7a-12440afb4e22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.316044 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94fa194-a338-4d4d-9b7a-12440afb4e22-kube-api-access-zv2q6" (OuterVolumeSpecName: "kube-api-access-zv2q6") pod "a94fa194-a338-4d4d-9b7a-12440afb4e22" (UID: "a94fa194-a338-4d4d-9b7a-12440afb4e22"). InnerVolumeSpecName "kube-api-access-zv2q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.411333 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv2q6\" (UniqueName: \"kubernetes.io/projected/a94fa194-a338-4d4d-9b7a-12440afb4e22-kube-api-access-zv2q6\") on node \"crc\" DevicePath \"\"" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.411367 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.456127 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a94fa194-a338-4d4d-9b7a-12440afb4e22" (UID: "a94fa194-a338-4d4d-9b7a-12440afb4e22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.512285 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94fa194-a338-4d4d-9b7a-12440afb4e22-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.918475 4733 generic.go:334] "Generic (PLEG): container finished" podID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerID="d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8" exitCode=0 Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.918575 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zcphv" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.918578 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcphv" event={"ID":"a94fa194-a338-4d4d-9b7a-12440afb4e22","Type":"ContainerDied","Data":"d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8"} Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.919510 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcphv" event={"ID":"a94fa194-a338-4d4d-9b7a-12440afb4e22","Type":"ContainerDied","Data":"1acf1672f2e3aed16d46401b9ad4350dbc89e72639a53dcefec1594e4af7cc4b"} Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.919536 4733 scope.go:117] "RemoveContainer" containerID="d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.946135 4733 scope.go:117] "RemoveContainer" containerID="8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214" Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.960691 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zcphv"] Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 
10:27:44.968832 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zcphv"] Mar 18 10:27:44 crc kubenswrapper[4733]: I0318 10:27:44.997798 4733 scope.go:117] "RemoveContainer" containerID="f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05" Mar 18 10:27:45 crc kubenswrapper[4733]: I0318 10:27:45.015961 4733 scope.go:117] "RemoveContainer" containerID="d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8" Mar 18 10:27:45 crc kubenswrapper[4733]: E0318 10:27:45.017626 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8\": container with ID starting with d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8 not found: ID does not exist" containerID="d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8" Mar 18 10:27:45 crc kubenswrapper[4733]: I0318 10:27:45.017830 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8"} err="failed to get container status \"d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8\": rpc error: code = NotFound desc = could not find container \"d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8\": container with ID starting with d10fef6862efb0197934c1e27e0535679692d0bab9a9c4f0d5d699fb93ec91c8 not found: ID does not exist" Mar 18 10:27:45 crc kubenswrapper[4733]: I0318 10:27:45.017972 4733 scope.go:117] "RemoveContainer" containerID="8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214" Mar 18 10:27:45 crc kubenswrapper[4733]: E0318 10:27:45.018645 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214\": container with ID 
starting with 8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214 not found: ID does not exist" containerID="8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214" Mar 18 10:27:45 crc kubenswrapper[4733]: I0318 10:27:45.018718 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214"} err="failed to get container status \"8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214\": rpc error: code = NotFound desc = could not find container \"8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214\": container with ID starting with 8fef4e90a34ce3bc76baae5a86a5806e4e83002d16b9733dc19bc2636f351214 not found: ID does not exist" Mar 18 10:27:45 crc kubenswrapper[4733]: I0318 10:27:45.018766 4733 scope.go:117] "RemoveContainer" containerID="f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05" Mar 18 10:27:45 crc kubenswrapper[4733]: E0318 10:27:45.019371 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05\": container with ID starting with f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05 not found: ID does not exist" containerID="f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05" Mar 18 10:27:45 crc kubenswrapper[4733]: I0318 10:27:45.019454 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05"} err="failed to get container status \"f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05\": rpc error: code = NotFound desc = could not find container \"f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05\": container with ID starting with f9bed651ee1eb90b0878bd6c19ea89ecbdc4aec7e6e5c4b2a087036bf1d47f05 not found: 
ID does not exist" Mar 18 10:27:45 crc kubenswrapper[4733]: I0318 10:27:45.183261 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" path="/var/lib/kubelet/pods/a94fa194-a338-4d4d-9b7a-12440afb4e22/volumes" Mar 18 10:27:57 crc kubenswrapper[4733]: I0318 10:27:57.205989 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c99d9f4d6-n5lc9" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.152254 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563828-zczv7"] Mar 18 10:28:00 crc kubenswrapper[4733]: E0318 10:28:00.152775 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="registry-server" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.152788 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="registry-server" Mar 18 10:28:00 crc kubenswrapper[4733]: E0318 10:28:00.152805 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="extract-content" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.152813 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="extract-content" Mar 18 10:28:00 crc kubenswrapper[4733]: E0318 10:28:00.152826 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="extract-utilities" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.152835 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="extract-utilities" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.152973 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a94fa194-a338-4d4d-9b7a-12440afb4e22" containerName="registry-server" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.153469 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-zczv7" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.156559 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.157109 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.158006 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.172863 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-zczv7"] Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.346220 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smzsq\" (UniqueName: \"kubernetes.io/projected/68574d72-725d-48c2-b645-bd83dcccbf80-kube-api-access-smzsq\") pod \"auto-csr-approver-29563828-zczv7\" (UID: \"68574d72-725d-48c2-b645-bd83dcccbf80\") " pod="openshift-infra/auto-csr-approver-29563828-zczv7" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.447246 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smzsq\" (UniqueName: \"kubernetes.io/projected/68574d72-725d-48c2-b645-bd83dcccbf80-kube-api-access-smzsq\") pod \"auto-csr-approver-29563828-zczv7\" (UID: \"68574d72-725d-48c2-b645-bd83dcccbf80\") " pod="openshift-infra/auto-csr-approver-29563828-zczv7" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.484161 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smzsq\" 
(UniqueName: \"kubernetes.io/projected/68574d72-725d-48c2-b645-bd83dcccbf80-kube-api-access-smzsq\") pod \"auto-csr-approver-29563828-zczv7\" (UID: \"68574d72-725d-48c2-b645-bd83dcccbf80\") " pod="openshift-infra/auto-csr-approver-29563828-zczv7" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.484684 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-zczv7" Mar 18 10:28:00 crc kubenswrapper[4733]: I0318 10:28:00.779876 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-zczv7"] Mar 18 10:28:01 crc kubenswrapper[4733]: I0318 10:28:01.034295 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-zczv7" event={"ID":"68574d72-725d-48c2-b645-bd83dcccbf80","Type":"ContainerStarted","Data":"5b7e6b80bc4fb02a1f7afcf04f2e7e7266bf3a39aab1842bdbd19ce39cd1c153"} Mar 18 10:28:03 crc kubenswrapper[4733]: I0318 10:28:03.054551 4733 generic.go:334] "Generic (PLEG): container finished" podID="68574d72-725d-48c2-b645-bd83dcccbf80" containerID="f4a3549ea82cce03bd994263d641938a407bdfdc2f86792bccee0b653493614d" exitCode=0 Mar 18 10:28:03 crc kubenswrapper[4733]: I0318 10:28:03.054645 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-zczv7" event={"ID":"68574d72-725d-48c2-b645-bd83dcccbf80","Type":"ContainerDied","Data":"f4a3549ea82cce03bd994263d641938a407bdfdc2f86792bccee0b653493614d"} Mar 18 10:28:04 crc kubenswrapper[4733]: I0318 10:28:04.306954 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-zczv7" Mar 18 10:28:04 crc kubenswrapper[4733]: I0318 10:28:04.507787 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smzsq\" (UniqueName: \"kubernetes.io/projected/68574d72-725d-48c2-b645-bd83dcccbf80-kube-api-access-smzsq\") pod \"68574d72-725d-48c2-b645-bd83dcccbf80\" (UID: \"68574d72-725d-48c2-b645-bd83dcccbf80\") " Mar 18 10:28:04 crc kubenswrapper[4733]: I0318 10:28:04.516365 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68574d72-725d-48c2-b645-bd83dcccbf80-kube-api-access-smzsq" (OuterVolumeSpecName: "kube-api-access-smzsq") pod "68574d72-725d-48c2-b645-bd83dcccbf80" (UID: "68574d72-725d-48c2-b645-bd83dcccbf80"). InnerVolumeSpecName "kube-api-access-smzsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:28:04 crc kubenswrapper[4733]: I0318 10:28:04.609625 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smzsq\" (UniqueName: \"kubernetes.io/projected/68574d72-725d-48c2-b645-bd83dcccbf80-kube-api-access-smzsq\") on node \"crc\" DevicePath \"\"" Mar 18 10:28:05 crc kubenswrapper[4733]: I0318 10:28:05.071737 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563828-zczv7" event={"ID":"68574d72-725d-48c2-b645-bd83dcccbf80","Type":"ContainerDied","Data":"5b7e6b80bc4fb02a1f7afcf04f2e7e7266bf3a39aab1842bdbd19ce39cd1c153"} Mar 18 10:28:05 crc kubenswrapper[4733]: I0318 10:28:05.071816 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7e6b80bc4fb02a1f7afcf04f2e7e7266bf3a39aab1842bdbd19ce39cd1c153" Mar 18 10:28:05 crc kubenswrapper[4733]: I0318 10:28:05.071828 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563828-zczv7" Mar 18 10:28:05 crc kubenswrapper[4733]: I0318 10:28:05.375500 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-4fvb8"] Mar 18 10:28:05 crc kubenswrapper[4733]: I0318 10:28:05.382542 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563822-4fvb8"] Mar 18 10:28:07 crc kubenswrapper[4733]: I0318 10:28:07.189486 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f93d05a-41f2-4422-88aa-9dfddb13191f" path="/var/lib/kubelet/pods/3f93d05a-41f2-4422-88aa-9dfddb13191f/volumes" Mar 18 10:28:16 crc kubenswrapper[4733]: I0318 10:28:16.961275 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5ddc5ff65-jst9z" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.744795 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pc5zz"] Mar 18 10:28:17 crc kubenswrapper[4733]: E0318 10:28:17.745069 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68574d72-725d-48c2-b645-bd83dcccbf80" containerName="oc" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.745092 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="68574d72-725d-48c2-b645-bd83dcccbf80" containerName="oc" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.745253 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="68574d72-725d-48c2-b645-bd83dcccbf80" containerName="oc" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.747537 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.748990 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.751505 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.751816 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5nckh" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.751883 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg"] Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.752935 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.755447 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.761833 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg"] Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.829905 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zg5cv"] Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.830965 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zg5cv" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.833736 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.833768 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.833788 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.834460 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-cbdb8" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.846388 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-zsljc"] Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.847541 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.849492 4733 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.853988 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zsljc"] Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900434 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp57k\" (UniqueName: \"kubernetes.io/projected/03476444-8ff8-4b1e-bcbc-ee654241370b-kube-api-access-lp57k\") pod \"frr-k8s-webhook-server-bcc4b6f68-dr9dg\" (UID: \"03476444-8ff8-4b1e-bcbc-ee654241370b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900476 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-metrics\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900495 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03476444-8ff8-4b1e-bcbc-ee654241370b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-dr9dg\" (UID: \"03476444-8ff8-4b1e-bcbc-ee654241370b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900608 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-startup\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " 
pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900627 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-conf\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900662 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-sockets\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900704 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-metrics-certs\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900726 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvj9w\" (UniqueName: \"kubernetes.io/projected/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-kube-api-access-pvj9w\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:17 crc kubenswrapper[4733]: I0318 10:28:17.900812 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-reloader\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.002589 4733 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-memberlist\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.002643 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-startup\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.002661 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-conf\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.002688 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zkdn\" (UniqueName: \"kubernetes.io/projected/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-kube-api-access-9zkdn\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") " pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.002747 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/101c5687-bebd-449f-94c8-03077bf596d0-metallb-excludel2\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.002792 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-84g5w\" (UniqueName: \"kubernetes.io/projected/101c5687-bebd-449f-94c8-03077bf596d0-kube-api-access-84g5w\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.002844 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-sockets\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.002866 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-metrics-certs\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003148 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-conf\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003247 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-sockets\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003299 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-metrics-certs\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") 
" pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003343 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-metrics-certs\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003363 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvj9w\" (UniqueName: \"kubernetes.io/projected/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-kube-api-access-pvj9w\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003411 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-reloader\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003616 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-frr-startup\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003633 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-reloader\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003430 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lp57k\" (UniqueName: \"kubernetes.io/projected/03476444-8ff8-4b1e-bcbc-ee654241370b-kube-api-access-lp57k\") pod \"frr-k8s-webhook-server-bcc4b6f68-dr9dg\" (UID: \"03476444-8ff8-4b1e-bcbc-ee654241370b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003680 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-metrics\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003695 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-cert\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") " pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.003950 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-metrics\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.004010 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03476444-8ff8-4b1e-bcbc-ee654241370b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-dr9dg\" (UID: \"03476444-8ff8-4b1e-bcbc-ee654241370b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.008543 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-metrics-certs\") 
pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.016358 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03476444-8ff8-4b1e-bcbc-ee654241370b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-dr9dg\" (UID: \"03476444-8ff8-4b1e-bcbc-ee654241370b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.020545 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp57k\" (UniqueName: \"kubernetes.io/projected/03476444-8ff8-4b1e-bcbc-ee654241370b-kube-api-access-lp57k\") pod \"frr-k8s-webhook-server-bcc4b6f68-dr9dg\" (UID: \"03476444-8ff8-4b1e-bcbc-ee654241370b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.023642 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvj9w\" (UniqueName: \"kubernetes.io/projected/4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e-kube-api-access-pvj9w\") pod \"frr-k8s-pc5zz\" (UID: \"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e\") " pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.066618 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.074491 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.105376 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-memberlist\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.105459 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zkdn\" (UniqueName: \"kubernetes.io/projected/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-kube-api-access-9zkdn\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") " pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.105488 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/101c5687-bebd-449f-94c8-03077bf596d0-metallb-excludel2\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.105511 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84g5w\" (UniqueName: \"kubernetes.io/projected/101c5687-bebd-449f-94c8-03077bf596d0-kube-api-access-84g5w\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: E0318 10:28:18.105516 4733 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.105553 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-metrics-certs\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") " pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: E0318 10:28:18.105589 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-memberlist podName:101c5687-bebd-449f-94c8-03077bf596d0 nodeName:}" failed. No retries permitted until 2026-03-18 10:28:18.605567199 +0000 UTC m=+938.097301524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-memberlist") pod "speaker-zg5cv" (UID: "101c5687-bebd-449f-94c8-03077bf596d0") : secret "metallb-memberlist" not found Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.105613 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-metrics-certs\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.105716 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-cert\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") " pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.106754 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/101c5687-bebd-449f-94c8-03077bf596d0-metallb-excludel2\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.107984 4733 
reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.109479 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-metrics-certs\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") " pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.112519 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-metrics-certs\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.120642 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-cert\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") " pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.133346 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84g5w\" (UniqueName: \"kubernetes.io/projected/101c5687-bebd-449f-94c8-03077bf596d0-kube-api-access-84g5w\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.140999 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zkdn\" (UniqueName: \"kubernetes.io/projected/7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0-kube-api-access-9zkdn\") pod \"controller-7bb4cc7c98-zsljc\" (UID: \"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0\") " pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc 
kubenswrapper[4733]: I0318 10:28:18.159951 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.390809 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zsljc"] Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.543499 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg"] Mar 18 10:28:18 crc kubenswrapper[4733]: W0318 10:28:18.553517 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03476444_8ff8_4b1e_bcbc_ee654241370b.slice/crio-2e377ff8591027d484d966029946b79e8ff3f7ceac70bb72e86775a2f0b2d378 WatchSource:0}: Error finding container 2e377ff8591027d484d966029946b79e8ff3f7ceac70bb72e86775a2f0b2d378: Status 404 returned error can't find the container with id 2e377ff8591027d484d966029946b79e8ff3f7ceac70bb72e86775a2f0b2d378 Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.610716 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-memberlist\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.616386 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/101c5687-bebd-449f-94c8-03077bf596d0-memberlist\") pod \"speaker-zg5cv\" (UID: \"101c5687-bebd-449f-94c8-03077bf596d0\") " pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: I0318 10:28:18.743283 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zg5cv" Mar 18 10:28:18 crc kubenswrapper[4733]: W0318 10:28:18.767720 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod101c5687_bebd_449f_94c8_03077bf596d0.slice/crio-361d42d5a7326ad4e5eae10ddda3f7b97c0199c2d631383b5fa2146dfd825fdd WatchSource:0}: Error finding container 361d42d5a7326ad4e5eae10ddda3f7b97c0199c2d631383b5fa2146dfd825fdd: Status 404 returned error can't find the container with id 361d42d5a7326ad4e5eae10ddda3f7b97c0199c2d631383b5fa2146dfd825fdd Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.212845 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zsljc" event={"ID":"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0","Type":"ContainerStarted","Data":"ba7b307fc134f9369d0894115c979129b8eb46fcc62b5c869e057899a1e2b1cf"} Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.213288 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.213309 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zsljc" event={"ID":"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0","Type":"ContainerStarted","Data":"73ba0541c10f5b0a50e8ce47bb8a17737bc73f260ca54632bdd84ee798bebe1f"} Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.213322 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zsljc" event={"ID":"7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0","Type":"ContainerStarted","Data":"115c888652219a3db8c1591a5cf20ced90238786397975c66c708d7deb9caf63"} Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.214857 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zg5cv" 
event={"ID":"101c5687-bebd-449f-94c8-03077bf596d0","Type":"ContainerStarted","Data":"492daea53269b124abeba4fabdb1687ec7eb9f9c7b9c930a5b934ddd7bd1690a"} Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.214891 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zg5cv" event={"ID":"101c5687-bebd-449f-94c8-03077bf596d0","Type":"ContainerStarted","Data":"361d42d5a7326ad4e5eae10ddda3f7b97c0199c2d631383b5fa2146dfd825fdd"} Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.215939 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerStarted","Data":"d3ce1c0fa6046f7621200f51af9eca75a94739beedb1c458b841612084cce14d"} Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.216970 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" event={"ID":"03476444-8ff8-4b1e-bcbc-ee654241370b","Type":"ContainerStarted","Data":"2e377ff8591027d484d966029946b79e8ff3f7ceac70bb72e86775a2f0b2d378"} Mar 18 10:28:19 crc kubenswrapper[4733]: I0318 10:28:19.235009 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-zsljc" podStartSLOduration=2.234987441 podStartE2EDuration="2.234987441s" podCreationTimestamp="2026-03-18 10:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:28:19.230663368 +0000 UTC m=+938.722397713" watchObservedRunningTime="2026-03-18 10:28:19.234987441 +0000 UTC m=+938.726721766" Mar 18 10:28:20 crc kubenswrapper[4733]: I0318 10:28:20.227833 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zg5cv" event={"ID":"101c5687-bebd-449f-94c8-03077bf596d0","Type":"ContainerStarted","Data":"8b1c9943dcbc575fc8ba1fcceaa064a66e93c0d1b38dd42aa5436b6fff4c0b33"} Mar 18 10:28:20 crc 
kubenswrapper[4733]: I0318 10:28:20.246838 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zg5cv" podStartSLOduration=3.246824036 podStartE2EDuration="3.246824036s" podCreationTimestamp="2026-03-18 10:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:28:20.244866141 +0000 UTC m=+939.736600466" watchObservedRunningTime="2026-03-18 10:28:20.246824036 +0000 UTC m=+939.738558361" Mar 18 10:28:21 crc kubenswrapper[4733]: I0318 10:28:21.232276 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zg5cv" Mar 18 10:28:26 crc kubenswrapper[4733]: I0318 10:28:26.270113 4733 generic.go:334] "Generic (PLEG): container finished" podID="4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e" containerID="77592e00ae10c5cb3abbce4b303f094fcb801175e4f65997fe669e8b2217f4a0" exitCode=0 Mar 18 10:28:26 crc kubenswrapper[4733]: I0318 10:28:26.270251 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerDied","Data":"77592e00ae10c5cb3abbce4b303f094fcb801175e4f65997fe669e8b2217f4a0"} Mar 18 10:28:26 crc kubenswrapper[4733]: I0318 10:28:26.274977 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" event={"ID":"03476444-8ff8-4b1e-bcbc-ee654241370b","Type":"ContainerStarted","Data":"15e8ccb7c43f4feaba91b540617a0850e2756a1b17a1f22655bbc0aab2ef0119"} Mar 18 10:28:26 crc kubenswrapper[4733]: I0318 10:28:26.275215 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:26 crc kubenswrapper[4733]: I0318 10:28:26.335621 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" 
podStartSLOduration=1.930053147 podStartE2EDuration="9.335594652s" podCreationTimestamp="2026-03-18 10:28:17 +0000 UTC" firstStartedPulling="2026-03-18 10:28:18.556155486 +0000 UTC m=+938.047889831" lastFinishedPulling="2026-03-18 10:28:25.961697011 +0000 UTC m=+945.453431336" observedRunningTime="2026-03-18 10:28:26.326412982 +0000 UTC m=+945.818147347" watchObservedRunningTime="2026-03-18 10:28:26.335594652 +0000 UTC m=+945.827329007" Mar 18 10:28:27 crc kubenswrapper[4733]: I0318 10:28:27.285684 4733 generic.go:334] "Generic (PLEG): container finished" podID="4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e" containerID="19ffeb7c6c69ef8972961611d99d630a3ffff4ad8ae9c2bdcefdae455d13de4e" exitCode=0 Mar 18 10:28:27 crc kubenswrapper[4733]: I0318 10:28:27.285763 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerDied","Data":"19ffeb7c6c69ef8972961611d99d630a3ffff4ad8ae9c2bdcefdae455d13de4e"} Mar 18 10:28:28 crc kubenswrapper[4733]: I0318 10:28:28.165488 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-zsljc" Mar 18 10:28:28 crc kubenswrapper[4733]: I0318 10:28:28.308895 4733 generic.go:334] "Generic (PLEG): container finished" podID="4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e" containerID="8f29496ea299a3c1c374b7f0ca06a25badcf18e2cc3b29bdd493053b8730b4c7" exitCode=0 Mar 18 10:28:28 crc kubenswrapper[4733]: I0318 10:28:28.308939 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerDied","Data":"8f29496ea299a3c1c374b7f0ca06a25badcf18e2cc3b29bdd493053b8730b4c7"} Mar 18 10:28:29 crc kubenswrapper[4733]: I0318 10:28:29.324552 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" 
event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerStarted","Data":"b1c58c746b575489046dabc4b9328bd430507d943f24275c2f719c3f76f7aae8"} Mar 18 10:28:29 crc kubenswrapper[4733]: I0318 10:28:29.324895 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerStarted","Data":"e474f08af12ff42635fc4df678c01fa724e980a724fc8145d689081cc137a0f0"} Mar 18 10:28:29 crc kubenswrapper[4733]: I0318 10:28:29.324906 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerStarted","Data":"89057b4cfb461860fbc73703161e75c7cc575a895caae65b565138321b2f5dbd"} Mar 18 10:28:29 crc kubenswrapper[4733]: I0318 10:28:29.324915 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerStarted","Data":"417383e21e4a51247d15e7ba43740fe7b14bcc54788cb95fc208160cf62e42b7"} Mar 18 10:28:29 crc kubenswrapper[4733]: I0318 10:28:29.324923 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerStarted","Data":"9a6288e6c81f2c40bc1cb5404e34f2002e68018920483c192b0523532e3f0a05"} Mar 18 10:28:30 crc kubenswrapper[4733]: I0318 10:28:30.338107 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pc5zz" event={"ID":"4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e","Type":"ContainerStarted","Data":"ecdc9b2404abc2ca69731d8334d73462bea7eaeba4cb929a36168096f466678a"} Mar 18 10:28:30 crc kubenswrapper[4733]: I0318 10:28:30.339062 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:30 crc kubenswrapper[4733]: I0318 10:28:30.373625 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-pc5zz" podStartSLOduration=5.651094269 podStartE2EDuration="13.373600419s" podCreationTimestamp="2026-03-18 10:28:17 +0000 UTC" firstStartedPulling="2026-03-18 10:28:18.250307247 +0000 UTC m=+937.742041572" lastFinishedPulling="2026-03-18 10:28:25.972813397 +0000 UTC m=+945.464547722" observedRunningTime="2026-03-18 10:28:30.371224052 +0000 UTC m=+949.862958397" watchObservedRunningTime="2026-03-18 10:28:30.373600419 +0000 UTC m=+949.865334754" Mar 18 10:28:33 crc kubenswrapper[4733]: I0318 10:28:33.067120 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:33 crc kubenswrapper[4733]: I0318 10:28:33.123814 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:38 crc kubenswrapper[4733]: I0318 10:28:38.073137 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pc5zz" Mar 18 10:28:38 crc kubenswrapper[4733]: I0318 10:28:38.079079 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" Mar 18 10:28:38 crc kubenswrapper[4733]: I0318 10:28:38.750605 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zg5cv" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.091801 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-g2m9r"] Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.094418 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.099917 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8s8qq" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.100866 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.100872 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.118898 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g2m9r"] Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.199519 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqf9r\" (UniqueName: \"kubernetes.io/projected/45605961-e7c2-4bd3-a670-d8541124408a-kube-api-access-hqf9r\") pod \"openstack-operator-index-g2m9r\" (UID: \"45605961-e7c2-4bd3-a670-d8541124408a\") " pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.300647 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqf9r\" (UniqueName: \"kubernetes.io/projected/45605961-e7c2-4bd3-a670-d8541124408a-kube-api-access-hqf9r\") pod \"openstack-operator-index-g2m9r\" (UID: \"45605961-e7c2-4bd3-a670-d8541124408a\") " pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.318810 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqf9r\" (UniqueName: \"kubernetes.io/projected/45605961-e7c2-4bd3-a670-d8541124408a-kube-api-access-hqf9r\") pod \"openstack-operator-index-g2m9r\" (UID: 
\"45605961-e7c2-4bd3-a670-d8541124408a\") " pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.419049 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:45 crc kubenswrapper[4733]: I0318 10:28:45.672802 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g2m9r"] Mar 18 10:28:46 crc kubenswrapper[4733]: I0318 10:28:46.492056 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g2m9r" event={"ID":"45605961-e7c2-4bd3-a670-d8541124408a","Type":"ContainerStarted","Data":"bf248de56e76050e45ec04123a6cc0d376a854e1d370cac6f0db486c1ee041fb"} Mar 18 10:28:49 crc kubenswrapper[4733]: I0318 10:28:49.524363 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g2m9r" event={"ID":"45605961-e7c2-4bd3-a670-d8541124408a","Type":"ContainerStarted","Data":"df46845da741ad71124c2ae43e10c583672f7a52bd0107a068c1058d79eaf580"} Mar 18 10:28:49 crc kubenswrapper[4733]: I0318 10:28:49.548175 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-g2m9r" podStartSLOduration=1.752075413 podStartE2EDuration="4.548149963s" podCreationTimestamp="2026-03-18 10:28:45 +0000 UTC" firstStartedPulling="2026-03-18 10:28:45.681947573 +0000 UTC m=+965.173681908" lastFinishedPulling="2026-03-18 10:28:48.478022093 +0000 UTC m=+967.969756458" observedRunningTime="2026-03-18 10:28:49.544441488 +0000 UTC m=+969.036175843" watchObservedRunningTime="2026-03-18 10:28:49.548149963 +0000 UTC m=+969.039884328" Mar 18 10:28:54 crc kubenswrapper[4733]: I0318 10:28:54.837741 4733 scope.go:117] "RemoveContainer" containerID="36c7a80bc1a34092c9183dbd958b5c05ea904377be8cffacb7112a1b4663e6a6" Mar 18 10:28:55 crc kubenswrapper[4733]: I0318 
10:28:55.420491 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:55 crc kubenswrapper[4733]: I0318 10:28:55.422165 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:55 crc kubenswrapper[4733]: I0318 10:28:55.466512 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:55 crc kubenswrapper[4733]: I0318 10:28:55.610611 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-g2m9r" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.139691 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67"] Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.142231 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.145349 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wwmnn" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.158658 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67"] Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.170941 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j47dh\" (UniqueName: \"kubernetes.io/projected/53c111d7-ea42-4913-b378-ec44062b0691-kube-api-access-j47dh\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.171737 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-bundle\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.171975 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-util\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 
10:28:57.273640 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-util\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.273814 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j47dh\" (UniqueName: \"kubernetes.io/projected/53c111d7-ea42-4913-b378-ec44062b0691-kube-api-access-j47dh\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.273872 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-bundle\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.274799 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-util\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.274821 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-bundle\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.297384 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j47dh\" (UniqueName: \"kubernetes.io/projected/53c111d7-ea42-4913-b378-ec44062b0691-kube-api-access-j47dh\") pod \"77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:57 crc kubenswrapper[4733]: I0318 10:28:57.478684 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:28:58 crc kubenswrapper[4733]: I0318 10:28:58.006308 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67"] Mar 18 10:28:58 crc kubenswrapper[4733]: I0318 10:28:58.596002 4733 generic.go:334] "Generic (PLEG): container finished" podID="53c111d7-ea42-4913-b378-ec44062b0691" containerID="0db45fe63d0f98cced7909596d8d0f8df1bf9f9dfa3875dad551f78b45821259" exitCode=0 Mar 18 10:28:58 crc kubenswrapper[4733]: I0318 10:28:58.596378 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" event={"ID":"53c111d7-ea42-4913-b378-ec44062b0691","Type":"ContainerDied","Data":"0db45fe63d0f98cced7909596d8d0f8df1bf9f9dfa3875dad551f78b45821259"} Mar 18 10:28:58 crc kubenswrapper[4733]: I0318 10:28:58.596417 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" event={"ID":"53c111d7-ea42-4913-b378-ec44062b0691","Type":"ContainerStarted","Data":"ff66937e27c0a955eaec5cd058090359982cd344502efa748ae95de95e350359"} Mar 18 10:28:59 crc kubenswrapper[4733]: I0318 10:28:59.609165 4733 generic.go:334] "Generic (PLEG): container finished" podID="53c111d7-ea42-4913-b378-ec44062b0691" containerID="49ad44c2d5b98fc6566f4f4bfdc1f9289d943440590ef96e4199d07928e7605a" exitCode=0 Mar 18 10:28:59 crc kubenswrapper[4733]: I0318 10:28:59.609250 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" event={"ID":"53c111d7-ea42-4913-b378-ec44062b0691","Type":"ContainerDied","Data":"49ad44c2d5b98fc6566f4f4bfdc1f9289d943440590ef96e4199d07928e7605a"} Mar 18 10:29:00 crc kubenswrapper[4733]: I0318 10:29:00.621300 4733 generic.go:334] "Generic (PLEG): container finished" podID="53c111d7-ea42-4913-b378-ec44062b0691" containerID="49b5d86705056140c0d3154ecb828d2e235cb59eee079edb114bc8df64c3ff85" exitCode=0 Mar 18 10:29:00 crc kubenswrapper[4733]: I0318 10:29:00.621423 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" event={"ID":"53c111d7-ea42-4913-b378-ec44062b0691","Type":"ContainerDied","Data":"49b5d86705056140c0d3154ecb828d2e235cb59eee079edb114bc8df64c3ff85"} Mar 18 10:29:01 crc kubenswrapper[4733]: I0318 10:29:01.953437 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.055934 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j47dh\" (UniqueName: \"kubernetes.io/projected/53c111d7-ea42-4913-b378-ec44062b0691-kube-api-access-j47dh\") pod \"53c111d7-ea42-4913-b378-ec44062b0691\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.056026 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-bundle\") pod \"53c111d7-ea42-4913-b378-ec44062b0691\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.056102 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-util\") pod \"53c111d7-ea42-4913-b378-ec44062b0691\" (UID: \"53c111d7-ea42-4913-b378-ec44062b0691\") " Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.057136 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-bundle" (OuterVolumeSpecName: "bundle") pod "53c111d7-ea42-4913-b378-ec44062b0691" (UID: "53c111d7-ea42-4913-b378-ec44062b0691"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.065878 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c111d7-ea42-4913-b378-ec44062b0691-kube-api-access-j47dh" (OuterVolumeSpecName: "kube-api-access-j47dh") pod "53c111d7-ea42-4913-b378-ec44062b0691" (UID: "53c111d7-ea42-4913-b378-ec44062b0691"). InnerVolumeSpecName "kube-api-access-j47dh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.085826 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-util" (OuterVolumeSpecName: "util") pod "53c111d7-ea42-4913-b378-ec44062b0691" (UID: "53c111d7-ea42-4913-b378-ec44062b0691"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.158052 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j47dh\" (UniqueName: \"kubernetes.io/projected/53c111d7-ea42-4913-b378-ec44062b0691-kube-api-access-j47dh\") on node \"crc\" DevicePath \"\"" Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.158106 4733 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.158124 4733 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53c111d7-ea42-4913-b378-ec44062b0691-util\") on node \"crc\" DevicePath \"\"" Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.644326 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" event={"ID":"53c111d7-ea42-4913-b378-ec44062b0691","Type":"ContainerDied","Data":"ff66937e27c0a955eaec5cd058090359982cd344502efa748ae95de95e350359"} Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.644361 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff66937e27c0a955eaec5cd058090359982cd344502efa748ae95de95e350359" Mar 18 10:29:02 crc kubenswrapper[4733]: I0318 10:29:02.644510 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.448668 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4"] Mar 18 10:29:06 crc kubenswrapper[4733]: E0318 10:29:06.449511 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c111d7-ea42-4913-b378-ec44062b0691" containerName="extract" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.449525 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c111d7-ea42-4913-b378-ec44062b0691" containerName="extract" Mar 18 10:29:06 crc kubenswrapper[4733]: E0318 10:29:06.449549 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c111d7-ea42-4913-b378-ec44062b0691" containerName="util" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.449557 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c111d7-ea42-4913-b378-ec44062b0691" containerName="util" Mar 18 10:29:06 crc kubenswrapper[4733]: E0318 10:29:06.449568 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c111d7-ea42-4913-b378-ec44062b0691" containerName="pull" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.449576 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c111d7-ea42-4913-b378-ec44062b0691" containerName="pull" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.449692 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c111d7-ea42-4913-b378-ec44062b0691" containerName="extract" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.450127 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.452448 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-7kkxj" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.527686 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kh4c\" (UniqueName: \"kubernetes.io/projected/d1b10458-2335-4b46-9f63-c8a005096ff7-kube-api-access-2kh4c\") pod \"openstack-operator-controller-init-579f7bfb88-sfsb4\" (UID: \"d1b10458-2335-4b46-9f63-c8a005096ff7\") " pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.544194 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4"] Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.629109 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kh4c\" (UniqueName: \"kubernetes.io/projected/d1b10458-2335-4b46-9f63-c8a005096ff7-kube-api-access-2kh4c\") pod \"openstack-operator-controller-init-579f7bfb88-sfsb4\" (UID: \"d1b10458-2335-4b46-9f63-c8a005096ff7\") " pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.650324 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kh4c\" (UniqueName: \"kubernetes.io/projected/d1b10458-2335-4b46-9f63-c8a005096ff7-kube-api-access-2kh4c\") pod \"openstack-operator-controller-init-579f7bfb88-sfsb4\" (UID: \"d1b10458-2335-4b46-9f63-c8a005096ff7\") " pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" Mar 18 10:29:06 crc kubenswrapper[4733]: I0318 10:29:06.830616 4733 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" Mar 18 10:29:07 crc kubenswrapper[4733]: W0318 10:29:07.186079 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b10458_2335_4b46_9f63_c8a005096ff7.slice/crio-457e41d42cd11946e9089bf9f569c743a3e1481e1a8619f952166b14fdb3f696 WatchSource:0}: Error finding container 457e41d42cd11946e9089bf9f569c743a3e1481e1a8619f952166b14fdb3f696: Status 404 returned error can't find the container with id 457e41d42cd11946e9089bf9f569c743a3e1481e1a8619f952166b14fdb3f696 Mar 18 10:29:07 crc kubenswrapper[4733]: I0318 10:29:07.187602 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4"] Mar 18 10:29:07 crc kubenswrapper[4733]: I0318 10:29:07.678830 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" event={"ID":"d1b10458-2335-4b46-9f63-c8a005096ff7","Type":"ContainerStarted","Data":"457e41d42cd11946e9089bf9f569c743a3e1481e1a8619f952166b14fdb3f696"} Mar 18 10:29:11 crc kubenswrapper[4733]: I0318 10:29:11.707646 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" event={"ID":"d1b10458-2335-4b46-9f63-c8a005096ff7","Type":"ContainerStarted","Data":"08e673cd0aec30d6831dfb6c5a4e7caae3e8dde0cd321bb81c38a42ca8bdceb2"} Mar 18 10:29:11 crc kubenswrapper[4733]: I0318 10:29:11.708338 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" Mar 18 10:29:11 crc kubenswrapper[4733]: I0318 10:29:11.730818 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" podStartSLOduration=2.188099076 
podStartE2EDuration="5.730801277s" podCreationTimestamp="2026-03-18 10:29:06 +0000 UTC" firstStartedPulling="2026-03-18 10:29:07.18751695 +0000 UTC m=+986.679251275" lastFinishedPulling="2026-03-18 10:29:10.730219151 +0000 UTC m=+990.221953476" observedRunningTime="2026-03-18 10:29:11.729735887 +0000 UTC m=+991.221470202" watchObservedRunningTime="2026-03-18 10:29:11.730801277 +0000 UTC m=+991.222535602" Mar 18 10:29:13 crc kubenswrapper[4733]: I0318 10:29:13.571085 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:29:13 crc kubenswrapper[4733]: I0318 10:29:13.571614 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:29:16 crc kubenswrapper[4733]: I0318 10:29:16.833877 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-579f7bfb88-sfsb4" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.172333 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.174301 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.176775 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vnvsj" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.180146 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.181100 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.188279 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-mh8jr" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.198493 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.201625 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrfg8\" (UniqueName: \"kubernetes.io/projected/8fe910c4-798b-4381-a71d-697459f7f79a-kube-api-access-lrfg8\") pod \"cinder-operator-controller-manager-8d58dc466-v6zxn\" (UID: \"8fe910c4-798b-4381-a71d-697459f7f79a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.201722 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmw6\" (UniqueName: \"kubernetes.io/projected/0fb2ba68-fa0f-4483-afdf-2eb381c54320-kube-api-access-fsmw6\") pod \"barbican-operator-controller-manager-59bc569d95-sfv8v\" (UID: 
\"0fb2ba68-fa0f-4483-afdf-2eb381c54320\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.205282 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-t8796"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.206412 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.210876 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ks7mv" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.216282 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.221307 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.222087 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.224850 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-p5z7d" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.250156 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-t8796"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.304665 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmw6\" (UniqueName: \"kubernetes.io/projected/0fb2ba68-fa0f-4483-afdf-2eb381c54320-kube-api-access-fsmw6\") pod \"barbican-operator-controller-manager-59bc569d95-sfv8v\" (UID: \"0fb2ba68-fa0f-4483-afdf-2eb381c54320\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.304764 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhzwx\" (UniqueName: \"kubernetes.io/projected/bc0e28fc-cff0-4c39-8073-61d5d6481866-kube-api-access-nhzwx\") pod \"glance-operator-controller-manager-79df6bcc97-ljvrt\" (UID: \"bc0e28fc-cff0-4c39-8073-61d5d6481866\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.304824 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv8xv\" (UniqueName: \"kubernetes.io/projected/748f4855-3978-4ecd-805e-0fee34ce0094-kube-api-access-pv8xv\") pod \"designate-operator-controller-manager-588d4d986b-t8796\" (UID: \"748f4855-3978-4ecd-805e-0fee34ce0094\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.305098 
4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrfg8\" (UniqueName: \"kubernetes.io/projected/8fe910c4-798b-4381-a71d-697459f7f79a-kube-api-access-lrfg8\") pod \"cinder-operator-controller-manager-8d58dc466-v6zxn\" (UID: \"8fe910c4-798b-4381-a71d-697459f7f79a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.308295 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.318382 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.319463 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.326295 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-h62lg" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.339730 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.340704 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.343869 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmw6\" (UniqueName: \"kubernetes.io/projected/0fb2ba68-fa0f-4483-afdf-2eb381c54320-kube-api-access-fsmw6\") pod \"barbican-operator-controller-manager-59bc569d95-sfv8v\" (UID: \"0fb2ba68-fa0f-4483-afdf-2eb381c54320\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.346505 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bjq7m" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.346836 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrfg8\" (UniqueName: \"kubernetes.io/projected/8fe910c4-798b-4381-a71d-697459f7f79a-kube-api-access-lrfg8\") pod \"cinder-operator-controller-manager-8d58dc466-v6zxn\" (UID: \"8fe910c4-798b-4381-a71d-697459f7f79a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.362882 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.364091 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.365684 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.366058 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vp5bk" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.370965 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.382372 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.383262 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.387941 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-fd55l" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.407105 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.407461 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhzwx\" (UniqueName: \"kubernetes.io/projected/bc0e28fc-cff0-4c39-8073-61d5d6481866-kube-api-access-nhzwx\") pod \"glance-operator-controller-manager-79df6bcc97-ljvrt\" (UID: \"bc0e28fc-cff0-4c39-8073-61d5d6481866\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" Mar 18 10:29:36 crc 
kubenswrapper[4733]: I0318 10:29:36.407844 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv8xv\" (UniqueName: \"kubernetes.io/projected/748f4855-3978-4ecd-805e-0fee34ce0094-kube-api-access-pv8xv\") pod \"designate-operator-controller-manager-588d4d986b-t8796\" (UID: \"748f4855-3978-4ecd-805e-0fee34ce0094\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.425065 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.451244 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.459639 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-chmbd"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.460452 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.460796 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv8xv\" (UniqueName: \"kubernetes.io/projected/748f4855-3978-4ecd-805e-0fee34ce0094-kube-api-access-pv8xv\") pod \"designate-operator-controller-manager-588d4d986b-t8796\" (UID: \"748f4855-3978-4ecd-805e-0fee34ce0094\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.463395 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-28sd9" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.464088 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhzwx\" (UniqueName: \"kubernetes.io/projected/bc0e28fc-cff0-4c39-8073-61d5d6481866-kube-api-access-nhzwx\") pod \"glance-operator-controller-manager-79df6bcc97-ljvrt\" (UID: \"bc0e28fc-cff0-4c39-8073-61d5d6481866\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.470102 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.471054 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.472039 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.473493 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.475069 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7h96r" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.478597 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4n8bg" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.494470 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.502177 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.503407 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.508508 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-chmbd"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.509457 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.509508 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktf2p\" (UniqueName: 
\"kubernetes.io/projected/79dfdcde-0538-4777-959e-1daf2b6263de-kube-api-access-ktf2p\") pod \"mariadb-operator-controller-manager-67ccfc9778-4xzlc\" (UID: \"79dfdcde-0538-4777-959e-1daf2b6263de\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.509531 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnppd\" (UniqueName: \"kubernetes.io/projected/651c7dd5-3adc-48b4-b579-309258aa3735-kube-api-access-pnppd\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.509550 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2plph\" (UniqueName: \"kubernetes.io/projected/bd5ae902-d036-4e52-983d-aa3e1a86dca8-kube-api-access-2plph\") pod \"heat-operator-controller-manager-67dd5f86f5-cxlns\" (UID: \"bd5ae902-d036-4e52-983d-aa3e1a86dca8\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.509572 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g27px\" (UniqueName: \"kubernetes.io/projected/fd146b1e-59a9-4246-9520-f2d6f6cf6cd1-kube-api-access-g27px\") pod \"ironic-operator-controller-manager-6f787dddc9-pcscc\" (UID: \"fd146b1e-59a9-4246-9520-f2d6f6cf6cd1\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.509598 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv7tf\" (UniqueName: \"kubernetes.io/projected/ae8a8fbc-d425-4da5-afb3-438a85a43722-kube-api-access-dv7tf\") pod 
\"manila-operator-controller-manager-55f864c847-chmbd\" (UID: \"ae8a8fbc-d425-4da5-afb3-438a85a43722\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.509618 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4sjl\" (UniqueName: \"kubernetes.io/projected/838f8a80-01c0-41d8-b431-2a23c9235fab-kube-api-access-w4sjl\") pod \"horizon-operator-controller-manager-8464cc45fb-wkjtf\" (UID: \"838f8a80-01c0-41d8-b431-2a23c9235fab\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.509654 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8pd6\" (UniqueName: \"kubernetes.io/projected/de7565f5-677b-4aeb-90ab-0d632b28b295-kube-api-access-p8pd6\") pod \"keystone-operator-controller-manager-768b96df4c-tp4s7\" (UID: \"de7565f5-677b-4aeb-90ab-0d632b28b295\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.524311 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.542851 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-gkndg"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.543771 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.545120 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.547440 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jbq96" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.571738 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.572713 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.579400 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8kpfz" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.580161 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.590110 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-gkndg"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.609298 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk"] Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611644 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2plph\" (UniqueName: \"kubernetes.io/projected/bd5ae902-d036-4e52-983d-aa3e1a86dca8-kube-api-access-2plph\") pod \"heat-operator-controller-manager-67dd5f86f5-cxlns\" (UID: \"bd5ae902-d036-4e52-983d-aa3e1a86dca8\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611701 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g27px\" (UniqueName: \"kubernetes.io/projected/fd146b1e-59a9-4246-9520-f2d6f6cf6cd1-kube-api-access-g27px\") pod \"ironic-operator-controller-manager-6f787dddc9-pcscc\" (UID: \"fd146b1e-59a9-4246-9520-f2d6f6cf6cd1\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611727 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcwp7\" (UniqueName: \"kubernetes.io/projected/f93025ae-ebc3-4aed-bfde-e514d8b814ce-kube-api-access-lcwp7\") pod \"nova-operator-controller-manager-5d488d59fb-jmwdk\" (UID: \"f93025ae-ebc3-4aed-bfde-e514d8b814ce\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk" Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611769 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dv7tf\" (UniqueName: \"kubernetes.io/projected/ae8a8fbc-d425-4da5-afb3-438a85a43722-kube-api-access-dv7tf\") pod \"manila-operator-controller-manager-55f864c847-chmbd\" (UID: \"ae8a8fbc-d425-4da5-afb3-438a85a43722\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611796 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4sjl\" (UniqueName: \"kubernetes.io/projected/838f8a80-01c0-41d8-b431-2a23c9235fab-kube-api-access-w4sjl\") pod \"horizon-operator-controller-manager-8464cc45fb-wkjtf\" (UID: \"838f8a80-01c0-41d8-b431-2a23c9235fab\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611863 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8pd6\" (UniqueName: \"kubernetes.io/projected/de7565f5-677b-4aeb-90ab-0d632b28b295-kube-api-access-p8pd6\") pod \"keystone-operator-controller-manager-768b96df4c-tp4s7\" (UID: \"de7565f5-677b-4aeb-90ab-0d632b28b295\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611893 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611919 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvtv\" (UniqueName: \"kubernetes.io/projected/216f9239-7d2e-483e-a89f-0955a518aa4a-kube-api-access-qqvtv\") pod \"neutron-operator-controller-manager-767865f676-gkndg\" (UID: \"216f9239-7d2e-483e-a89f-0955a518aa4a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611953 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktf2p\" (UniqueName: \"kubernetes.io/projected/79dfdcde-0538-4777-959e-1daf2b6263de-kube-api-access-ktf2p\") pod \"mariadb-operator-controller-manager-67ccfc9778-4xzlc\" (UID: \"79dfdcde-0538-4777-959e-1daf2b6263de\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.611979 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnppd\" (UniqueName: \"kubernetes.io/projected/651c7dd5-3adc-48b4-b579-309258aa3735-kube-api-access-pnppd\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"
Mar 18 10:29:36 crc kubenswrapper[4733]: E0318 10:29:36.612888 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 10:29:36 crc kubenswrapper[4733]: E0318 10:29:36.612940 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert podName:651c7dd5-3adc-48b4-b579-309258aa3735 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:37.112921979 +0000 UTC m=+1016.604656304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert") pod "infra-operator-controller-manager-74c694b97b-j4snz" (UID: "651c7dd5-3adc-48b4-b579-309258aa3735") : secret "infra-operator-webhook-server-cert" not found
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.624772 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.625750 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.629991 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.635040 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ft7lx"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.640446 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv7tf\" (UniqueName: \"kubernetes.io/projected/ae8a8fbc-d425-4da5-afb3-438a85a43722-kube-api-access-dv7tf\") pod \"manila-operator-controller-manager-55f864c847-chmbd\" (UID: \"ae8a8fbc-d425-4da5-afb3-438a85a43722\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.643578 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8pd6\" (UniqueName: \"kubernetes.io/projected/de7565f5-677b-4aeb-90ab-0d632b28b295-kube-api-access-p8pd6\") pod \"keystone-operator-controller-manager-768b96df4c-tp4s7\" (UID: \"de7565f5-677b-4aeb-90ab-0d632b28b295\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.646359 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktf2p\" (UniqueName: \"kubernetes.io/projected/79dfdcde-0538-4777-959e-1daf2b6263de-kube-api-access-ktf2p\") pod \"mariadb-operator-controller-manager-67ccfc9778-4xzlc\" (UID: \"79dfdcde-0538-4777-959e-1daf2b6263de\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.648116 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.649224 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g27px\" (UniqueName: \"kubernetes.io/projected/fd146b1e-59a9-4246-9520-f2d6f6cf6cd1-kube-api-access-g27px\") pod \"ironic-operator-controller-manager-6f787dddc9-pcscc\" (UID: \"fd146b1e-59a9-4246-9520-f2d6f6cf6cd1\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.648749 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2plph\" (UniqueName: \"kubernetes.io/projected/bd5ae902-d036-4e52-983d-aa3e1a86dca8-kube-api-access-2plph\") pod \"heat-operator-controller-manager-67dd5f86f5-cxlns\" (UID: \"bd5ae902-d036-4e52-983d-aa3e1a86dca8\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.649663 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4sjl\" (UniqueName: \"kubernetes.io/projected/838f8a80-01c0-41d8-b431-2a23c9235fab-kube-api-access-w4sjl\") pod \"horizon-operator-controller-manager-8464cc45fb-wkjtf\" (UID: \"838f8a80-01c0-41d8-b431-2a23c9235fab\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.655938 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.661064 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gwtrb"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.668019 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-flv24"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.673452 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.675563 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.675933 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnppd\" (UniqueName: \"kubernetes.io/projected/651c7dd5-3adc-48b4-b579-309258aa3735-kube-api-access-pnppd\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.679946 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vdfsq"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.683817 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.695669 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.702111 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-flv24"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.709366 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.710386 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.711097 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.712511 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvtv\" (UniqueName: \"kubernetes.io/projected/216f9239-7d2e-483e-a89f-0955a518aa4a-kube-api-access-qqvtv\") pod \"neutron-operator-controller-manager-767865f676-gkndg\" (UID: \"216f9239-7d2e-483e-a89f-0955a518aa4a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.712570 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcwp7\" (UniqueName: \"kubernetes.io/projected/f93025ae-ebc3-4aed-bfde-e514d8b814ce-kube-api-access-lcwp7\") pod \"nova-operator-controller-manager-5d488d59fb-jmwdk\" (UID: \"f93025ae-ebc3-4aed-bfde-e514d8b814ce\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.712597 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rxf\" (UniqueName: \"kubernetes.io/projected/4ad2d88a-c733-4409-b07b-5ff4661e1b68-kube-api-access-54rxf\") pod \"placement-operator-controller-manager-5784578c99-9txbj\" (UID: \"4ad2d88a-c733-4409-b07b-5ff4661e1b68\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.712633 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.712649 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc2l9\" (UniqueName: \"kubernetes.io/projected/31999dbe-554e-4168-a902-1f62e82ce854-kube-api-access-rc2l9\") pod \"octavia-operator-controller-manager-5b9f45d989-22wt5\" (UID: \"31999dbe-554e-4168-a902-1f62e82ce854\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.712667 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntcl4\" (UniqueName: \"kubernetes.io/projected/6762c515-b422-4157-a8ce-b9ca4781e134-kube-api-access-ntcl4\") pod \"ovn-operator-controller-manager-884679f54-flv24\" (UID: \"6762c515-b422-4157-a8ce-b9ca4781e134\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.712694 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h9v6\" (UniqueName: \"kubernetes.io/projected/6eca2f16-53b8-4173-ace4-18b7292b1369-kube-api-access-4h9v6\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.712762 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-l6bcs"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.747254 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.757706 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.767036 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcwp7\" (UniqueName: \"kubernetes.io/projected/f93025ae-ebc3-4aed-bfde-e514d8b814ce-kube-api-access-lcwp7\") pod \"nova-operator-controller-manager-5d488d59fb-jmwdk\" (UID: \"f93025ae-ebc3-4aed-bfde-e514d8b814ce\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.781825 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.784786 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.785423 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvtv\" (UniqueName: \"kubernetes.io/projected/216f9239-7d2e-483e-a89f-0955a518aa4a-kube-api-access-qqvtv\") pod \"neutron-operator-controller-manager-767865f676-gkndg\" (UID: \"216f9239-7d2e-483e-a89f-0955a518aa4a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.786345 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qc8z2"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.798059 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.798892 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.810706 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jnvkk"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.813302 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntcl4\" (UniqueName: \"kubernetes.io/projected/6762c515-b422-4157-a8ce-b9ca4781e134-kube-api-access-ntcl4\") pod \"ovn-operator-controller-manager-884679f54-flv24\" (UID: \"6762c515-b422-4157-a8ce-b9ca4781e134\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.813347 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h9v6\" (UniqueName: \"kubernetes.io/projected/6eca2f16-53b8-4173-ace4-18b7292b1369-kube-api-access-4h9v6\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.813418 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54rxf\" (UniqueName: \"kubernetes.io/projected/4ad2d88a-c733-4409-b07b-5ff4661e1b68-kube-api-access-54rxf\") pod \"placement-operator-controller-manager-5784578c99-9txbj\" (UID: \"4ad2d88a-c733-4409-b07b-5ff4661e1b68\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.813453 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.813469 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc2l9\" (UniqueName: \"kubernetes.io/projected/31999dbe-554e-4168-a902-1f62e82ce854-kube-api-access-rc2l9\") pod \"octavia-operator-controller-manager-5b9f45d989-22wt5\" (UID: \"31999dbe-554e-4168-a902-1f62e82ce854\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5"
Mar 18 10:29:36 crc kubenswrapper[4733]: E0318 10:29:36.813952 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 10:29:36 crc kubenswrapper[4733]: E0318 10:29:36.814013 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert podName:6eca2f16-53b8-4173-ace4-18b7292b1369 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:37.313994376 +0000 UTC m=+1016.805728701 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" (UID: "6eca2f16-53b8-4173-ace4-18b7292b1369") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.817943 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.822646 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.824598 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.835369 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntcl4\" (UniqueName: \"kubernetes.io/projected/6762c515-b422-4157-a8ce-b9ca4781e134-kube-api-access-ntcl4\") pod \"ovn-operator-controller-manager-884679f54-flv24\" (UID: \"6762c515-b422-4157-a8ce-b9ca4781e134\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.842116 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rxf\" (UniqueName: \"kubernetes.io/projected/4ad2d88a-c733-4409-b07b-5ff4661e1b68-kube-api-access-54rxf\") pod \"placement-operator-controller-manager-5784578c99-9txbj\" (UID: \"4ad2d88a-c733-4409-b07b-5ff4661e1b68\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.843326 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9v6\" (UniqueName: \"kubernetes.io/projected/6eca2f16-53b8-4173-ace4-18b7292b1369-kube-api-access-4h9v6\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.853552 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc2l9\" (UniqueName: \"kubernetes.io/projected/31999dbe-554e-4168-a902-1f62e82ce854-kube-api-access-rc2l9\") pod \"octavia-operator-controller-manager-5b9f45d989-22wt5\" (UID: \"31999dbe-554e-4168-a902-1f62e82ce854\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.868884 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.870629 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.873291 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.880625 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.880844 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vfhsx"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.894716 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.915578 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.915922 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsj46\" (UniqueName: \"kubernetes.io/projected/759f85a1-4e24-4b61-879b-90801d648683-kube-api-access-jsj46\") pod \"telemetry-operator-controller-manager-d6b694c5-fd4t7\" (UID: \"759f85a1-4e24-4b61-879b-90801d648683\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.916007 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7l8w\" (UniqueName: \"kubernetes.io/projected/6ea742ac-3be9-4067-ab5a-032365494fde-kube-api-access-v7l8w\") pod \"swift-operator-controller-manager-c674c5965-v2pb2\" (UID: \"6ea742ac-3be9-4067-ab5a-032365494fde\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.917389 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.921303 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.922128 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.924518 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-qhdl6"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.936443 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.938582 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"]
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.942026 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.948381 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.948557 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.948666 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5xhfs"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.955598 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5"
Mar 18 10:29:36 crc kubenswrapper[4733]: I0318 10:29:36.966255 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"]
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.001698 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch"]
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.002559 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.007931 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6f5z6"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.008451 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch"]
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.017583 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7l8w\" (UniqueName: \"kubernetes.io/projected/6ea742ac-3be9-4067-ab5a-032365494fde-kube-api-access-v7l8w\") pod \"swift-operator-controller-manager-c674c5965-v2pb2\" (UID: \"6ea742ac-3be9-4067-ab5a-032365494fde\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.017638 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc79f\" (UniqueName: \"kubernetes.io/projected/6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54-kube-api-access-cc79f\") pod \"test-operator-controller-manager-5c5cb9c4d7-nskpj\" (UID: \"6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.017710 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsj46\" (UniqueName: \"kubernetes.io/projected/759f85a1-4e24-4b61-879b-90801d648683-kube-api-access-jsj46\") pod \"telemetry-operator-controller-manager-d6b694c5-fd4t7\" (UID: \"759f85a1-4e24-4b61-879b-90801d648683\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.061574 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsj46\" (UniqueName: \"kubernetes.io/projected/759f85a1-4e24-4b61-879b-90801d648683-kube-api-access-jsj46\") pod \"telemetry-operator-controller-manager-d6b694c5-fd4t7\" (UID: \"759f85a1-4e24-4b61-879b-90801d648683\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.064399 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7l8w\" (UniqueName: \"kubernetes.io/projected/6ea742ac-3be9-4067-ab5a-032365494fde-kube-api-access-v7l8w\") pod \"swift-operator-controller-manager-c674c5965-v2pb2\" (UID: \"6ea742ac-3be9-4067-ab5a-032365494fde\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.084493 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.112515 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.119476 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.119521 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc79f\" (UniqueName: \"kubernetes.io/projected/6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54-kube-api-access-cc79f\") pod \"test-operator-controller-manager-5c5cb9c4d7-nskpj\" (UID: \"6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.119679 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.119707 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psssn\" (UniqueName: \"kubernetes.io/projected/cd9234ed-fcbc-4d81-9034-27d39b3df6ee-kube-api-access-psssn\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sqr4g\" (UID: \"cd9234ed-fcbc-4d81-9034-27d39b3df6ee\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.119733 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cb9b\" (UniqueName: \"kubernetes.io/projected/a4b7e706-a9a7-490a-84a8-094d1d909ba8-kube-api-access-4cb9b\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.119766 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brp9l\" (UniqueName: \"kubernetes.io/projected/e64c7cd6-a04b-440e-ac47-40f672fbc333-kube-api-access-brp9l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k64ch\" (UID: \"e64c7cd6-a04b-440e-ac47-40f672fbc333\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.119812 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.119959 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.120010 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert podName:651c7dd5-3adc-48b4-b579-309258aa3735 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:38.119992109 +0000 UTC m=+1017.611726434 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert") pod "infra-operator-controller-manager-74c694b97b-j4snz" (UID: "651c7dd5-3adc-48b4-b579-309258aa3735") : secret "infra-operator-webhook-server-cert" not found
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.141220 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc79f\" (UniqueName: \"kubernetes.io/projected/6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54-kube-api-access-cc79f\") pod \"test-operator-controller-manager-5c5cb9c4d7-nskpj\" (UID: \"6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.170869 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.172736 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.222734 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.222779 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psssn\" (UniqueName: \"kubernetes.io/projected/cd9234ed-fcbc-4d81-9034-27d39b3df6ee-kube-api-access-psssn\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sqr4g\" (UID: \"cd9234ed-fcbc-4d81-9034-27d39b3df6ee\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.222797 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cb9b\" (UniqueName: \"kubernetes.io/projected/a4b7e706-a9a7-490a-84a8-094d1d909ba8-kube-api-access-4cb9b\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.222831 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brp9l\" (UniqueName: \"kubernetes.io/projected/e64c7cd6-a04b-440e-ac47-40f672fbc333-kube-api-access-brp9l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k64ch\" (UID: \"e64c7cd6-a04b-440e-ac47-40f672fbc333\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch"
Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.222866 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.222928 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.223000 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:37.722983342 +0000 UTC m=+1017.214717667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "webhook-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.223012 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.223055 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:37.723043083 +0000 UTC m=+1017.214777398 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "metrics-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.229439 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn"] Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.245231 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.249065 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psssn\" (UniqueName: \"kubernetes.io/projected/cd9234ed-fcbc-4d81-9034-27d39b3df6ee-kube-api-access-psssn\") pod \"watcher-operator-controller-manager-6c4d75f7f9-sqr4g\" (UID: \"cd9234ed-fcbc-4d81-9034-27d39b3df6ee\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.250011 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brp9l\" (UniqueName: \"kubernetes.io/projected/e64c7cd6-a04b-440e-ac47-40f672fbc333-kube-api-access-brp9l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k64ch\" (UID: \"e64c7cd6-a04b-440e-ac47-40f672fbc333\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch" Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.254370 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cb9b\" (UniqueName: \"kubernetes.io/projected/a4b7e706-a9a7-490a-84a8-094d1d909ba8-kube-api-access-4cb9b\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " 
pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.262144 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.325203 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.325377 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.325439 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert podName:6eca2f16-53b8-4173-ace4-18b7292b1369 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:38.325419249 +0000 UTC m=+1017.817153574 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" (UID: "6eca2f16-53b8-4173-ace4-18b7292b1369") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: W0318 10:29:37.326587 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fe910c4_798b_4381_a71d_697459f7f79a.slice/crio-3f68db198b7463ac4b05d69c1568aa5c5301de1dce7b45824d44840ffbb5e0c5 WatchSource:0}: Error finding container 3f68db198b7463ac4b05d69c1568aa5c5301de1dce7b45824d44840ffbb5e0c5: Status 404 returned error can't find the container with id 3f68db198b7463ac4b05d69c1568aa5c5301de1dce7b45824d44840ffbb5e0c5 Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.433847 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" event={"ID":"8fe910c4-798b-4381-a71d-697459f7f79a","Type":"ContainerStarted","Data":"3f68db198b7463ac4b05d69c1568aa5c5301de1dce7b45824d44840ffbb5e0c5"} Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.485616 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch" Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.734829 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.734901 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.735032 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.735112 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:38.735098655 +0000 UTC m=+1018.226832980 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "metrics-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.736796 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: E0318 10:29:37.736897 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:38.736877676 +0000 UTC m=+1018.228612001 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "webhook-server-cert" not found Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.754763 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-t8796"] Mar 18 10:29:37 crc kubenswrapper[4733]: I0318 10:29:37.796671 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v"] Mar 18 10:29:37 crc kubenswrapper[4733]: W0318 10:29:37.800527 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fb2ba68_fa0f_4483_afdf_2eb381c54320.slice/crio-088842a868b02ec21f48d7509be6d13ba2f98f0fb318430fd498f2f31804837a WatchSource:0}: Error finding container 088842a868b02ec21f48d7509be6d13ba2f98f0fb318430fd498f2f31804837a: Status 404 returned error can't find 
the container with id 088842a868b02ec21f48d7509be6d13ba2f98f0fb318430fd498f2f31804837a Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.090949 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.099111 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.105611 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.127642 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.144019 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.144311 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.144359 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert podName:651c7dd5-3adc-48b4-b579-309258aa3735 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:40.14434358 +0000 UTC m=+1019.636077905 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert") pod "infra-operator-controller-manager-74c694b97b-j4snz" (UID: "651c7dd5-3adc-48b4-b579-309258aa3735") : secret "infra-operator-webhook-server-cert" not found Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.168781 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.187117 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.200850 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-gkndg"] Mar 18 10:29:38 crc kubenswrapper[4733]: W0318 10:29:38.204178 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod216f9239_7d2e_483e_a89f_0955a518aa4a.slice/crio-717164cede3a5aeefb57345884e06771787e1902761118a64f9a630f0786dfab WatchSource:0}: Error finding container 717164cede3a5aeefb57345884e06771787e1902761118a64f9a630f0786dfab: Status 404 returned error can't find the container with id 717164cede3a5aeefb57345884e06771787e1902761118a64f9a630f0786dfab Mar 18 10:29:38 crc kubenswrapper[4733]: W0318 10:29:38.204398 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93025ae_ebc3_4aed_bfde_e514d8b814ce.slice/crio-7923bc481871d27755290fc6dca542bca37b41aefe9f9c9377630357efee6cc5 WatchSource:0}: Error finding container 7923bc481871d27755290fc6dca542bca37b41aefe9f9c9377630357efee6cc5: Status 404 returned error can't find the container with id 7923bc481871d27755290fc6dca542bca37b41aefe9f9c9377630357efee6cc5 Mar 18 10:29:38 crc kubenswrapper[4733]: 
W0318 10:29:38.206044 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8a8fbc_d425_4da5_afb3_438a85a43722.slice/crio-52924df880af4e3e594152ddc2ac8ec42109dc8f64facd9860786f838b963f8c WatchSource:0}: Error finding container 52924df880af4e3e594152ddc2ac8ec42109dc8f64facd9860786f838b963f8c: Status 404 returned error can't find the container with id 52924df880af4e3e594152ddc2ac8ec42109dc8f64facd9860786f838b963f8c Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.212296 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dv7tf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-chmbd_openstack-operators(ae8a8fbc-d425-4da5-afb3-438a85a43722): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.215048 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk"] Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.215126 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd" podUID="ae8a8fbc-d425-4da5-afb3-438a85a43722" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.219833 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-55f864c847-chmbd"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.225032 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-flv24"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.228465 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.296096 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"] Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.305549 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"] Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.310352 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jsj46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-fd4t7_openstack-operators(759f85a1-4e24-4b61-879b-90801d648683): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.311789 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7" podUID="759f85a1-4e24-4b61-879b-90801d648683" Mar 18 10:29:38 crc 
kubenswrapper[4733]: I0318 10:29:38.312274 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"] Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.312799 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cc79f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-nskpj_openstack-operators(6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.314573 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" podUID="6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54" Mar 18 10:29:38 crc kubenswrapper[4733]: W0318 10:29:38.317613 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ea742ac_3be9_4067_ab5a_032365494fde.slice/crio-a4a2daa0e2c90160ccc562b8b047470f9d8ba2da90c1e739c76cddd6f5c0d339 WatchSource:0}: Error finding container a4a2daa0e2c90160ccc562b8b047470f9d8ba2da90c1e739c76cddd6f5c0d339: Status 404 returned error can't find the container with id a4a2daa0e2c90160ccc562b8b047470f9d8ba2da90c1e739c76cddd6f5c0d339 Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.319993 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch"] Mar 18 10:29:38 crc kubenswrapper[4733]: W0318 10:29:38.324689 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad2d88a_c733_4409_b07b_5ff4661e1b68.slice/crio-810ef674bbeea3daf9a479b03dc0f650895e68ece3f19c3fba39a64077d8a2da WatchSource:0}: Error finding container 810ef674bbeea3daf9a479b03dc0f650895e68ece3f19c3fba39a64077d8a2da: Status 404 returned error can't find the container with id 810ef674bbeea3daf9a479b03dc0f650895e68ece3f19c3fba39a64077d8a2da Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.327612 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"] Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.329596 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-brp9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-k64ch_openstack-operators(e64c7cd6-a04b-440e-ac47-40f672fbc333): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.329764 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-54rxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-9txbj_openstack-operators(4ad2d88a-c733-4409-b07b-5ff4661e1b68): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.330547 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v7l8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-v2pb2_openstack-operators(6ea742ac-3be9-4067-ab5a-032365494fde): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.331612 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj" podUID="4ad2d88a-c733-4409-b07b-5ff4661e1b68" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.331672 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2" podUID="6ea742ac-3be9-4067-ab5a-032365494fde" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.331703 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch" podUID="e64c7cd6-a04b-440e-ac47-40f672fbc333" Mar 18 
10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.333545 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g"] Mar 18 10:29:38 crc kubenswrapper[4733]: W0318 10:29:38.334025 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd9234ed_fcbc_4d81_9034_27d39b3df6ee.slice/crio-f653d2a2a5115dd8636cc652178e2a29bc712a9f89a4aa8314c3a96375f2887f WatchSource:0}: Error finding container f653d2a2a5115dd8636cc652178e2a29bc712a9f89a4aa8314c3a96375f2887f: Status 404 returned error can't find the container with id f653d2a2a5115dd8636cc652178e2a29bc712a9f89a4aa8314c3a96375f2887f Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.336012 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-psssn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-sqr4g_openstack-operators(cd9234ed-fcbc-4d81-9034-27d39b3df6ee): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.337197 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" podUID="cd9234ed-fcbc-4d81-9034-27d39b3df6ee" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.360316 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.360505 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.360557 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert podName:6eca2f16-53b8-4173-ace4-18b7292b1369 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:40.360540975 +0000 UTC m=+1019.852275310 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" (UID: "6eca2f16-53b8-4173-ace4-18b7292b1369") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.443783 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd" event={"ID":"ae8a8fbc-d425-4da5-afb3-438a85a43722","Type":"ContainerStarted","Data":"52924df880af4e3e594152ddc2ac8ec42109dc8f64facd9860786f838b963f8c"} Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.446434 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd" podUID="ae8a8fbc-d425-4da5-afb3-438a85a43722" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.448925 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" event={"ID":"6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54","Type":"ContainerStarted","Data":"b1c5ce101e4b02d694f4e46490df1e2a8d3c161b0cea82fab9664954d93969c4"} Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.450657 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" podUID="6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.451848 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7" event={"ID":"759f85a1-4e24-4b61-879b-90801d648683","Type":"ContainerStarted","Data":"99b01cc5f636ca11bb07336e824f83b58a31e7c5eb67b8193ba81955c51fdacc"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.453690 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg" event={"ID":"216f9239-7d2e-483e-a89f-0955a518aa4a","Type":"ContainerStarted","Data":"717164cede3a5aeefb57345884e06771787e1902761118a64f9a630f0786dfab"} Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.454384 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7" podUID="759f85a1-4e24-4b61-879b-90801d648683" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.454960 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2" event={"ID":"6ea742ac-3be9-4067-ab5a-032365494fde","Type":"ContainerStarted","Data":"a4a2daa0e2c90160ccc562b8b047470f9d8ba2da90c1e739c76cddd6f5c0d339"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.455873 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc" event={"ID":"fd146b1e-59a9-4246-9520-f2d6f6cf6cd1","Type":"ContainerStarted","Data":"42b2fa3897738c27aae27e64fe8fd12e27930763f1c5ab42d550a86a108f7054"} Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.456245 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2" podUID="6ea742ac-3be9-4067-ab5a-032365494fde" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.461248 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj" event={"ID":"4ad2d88a-c733-4409-b07b-5ff4661e1b68","Type":"ContainerStarted","Data":"810ef674bbeea3daf9a479b03dc0f650895e68ece3f19c3fba39a64077d8a2da"} Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.463041 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj" podUID="4ad2d88a-c733-4409-b07b-5ff4661e1b68" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.463732 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7" event={"ID":"de7565f5-677b-4aeb-90ab-0d632b28b295","Type":"ContainerStarted","Data":"7c8b4e22673abb3f2244e1c43a8b76bdb4ef12d3caa4a97558cbbf7d2372e1f7"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.466460 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk" event={"ID":"f93025ae-ebc3-4aed-bfde-e514d8b814ce","Type":"ContainerStarted","Data":"7923bc481871d27755290fc6dca542bca37b41aefe9f9c9377630357efee6cc5"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.468030 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" event={"ID":"0fb2ba68-fa0f-4483-afdf-2eb381c54320","Type":"ContainerStarted","Data":"088842a868b02ec21f48d7509be6d13ba2f98f0fb318430fd498f2f31804837a"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.472180 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" event={"ID":"bc0e28fc-cff0-4c39-8073-61d5d6481866","Type":"ContainerStarted","Data":"0dcd0d8305b68c607bd9507592f538c00ad22c5eee49e0dc66436e89416262f9"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.476553 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" event={"ID":"748f4855-3978-4ecd-805e-0fee34ce0094","Type":"ContainerStarted","Data":"d0174c0fd3f0a67e352fae83a5dcec557c52831c71be828615bf69e0cd5d5c94"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.478118 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf" event={"ID":"838f8a80-01c0-41d8-b431-2a23c9235fab","Type":"ContainerStarted","Data":"26393f7f6af834af39a55ae9cf8af4ddf7cc7d8a2b57dfc8c2a817b2a41425bd"} Mar 18 10:29:38 crc 
kubenswrapper[4733]: I0318 10:29:38.487270 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24" event={"ID":"6762c515-b422-4157-a8ce-b9ca4781e134","Type":"ContainerStarted","Data":"f18e7a9763b0f4e2f719e251cdfefbaab95d7906fe1753a5def0e9d13fcf2f00"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.489908 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5" event={"ID":"31999dbe-554e-4168-a902-1f62e82ce854","Type":"ContainerStarted","Data":"5206dc219d51946d37b6adb4bfce1ad2ae9878b57407a33030422fe6a224973c"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.492784 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch" event={"ID":"e64c7cd6-a04b-440e-ac47-40f672fbc333","Type":"ContainerStarted","Data":"8e41e541f602b6562d26e9625b6313839135de78bad4176fe60420dda46a9b37"} Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.493933 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch" podUID="e64c7cd6-a04b-440e-ac47-40f672fbc333" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.495263 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" event={"ID":"cd9234ed-fcbc-4d81-9034-27d39b3df6ee","Type":"ContainerStarted","Data":"f653d2a2a5115dd8636cc652178e2a29bc712a9f89a4aa8314c3a96375f2887f"} Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.498169 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" podUID="cd9234ed-fcbc-4d81-9034-27d39b3df6ee" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.503808 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns" event={"ID":"bd5ae902-d036-4e52-983d-aa3e1a86dca8","Type":"ContainerStarted","Data":"be9db7d7665259b9c7f9456c3fa4100b6ed161f66799a501bf6e6d7163bd9be4"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.507098 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc" event={"ID":"79dfdcde-0538-4777-959e-1daf2b6263de","Type":"ContainerStarted","Data":"fb165269bb33745d1b2a430bb177e8169a32c140810add4c6ba806589792fde4"} Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.770354 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:38 crc kubenswrapper[4733]: I0318 10:29:38.770455 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.770605 4733 
secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.770709 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:40.770682036 +0000 UTC m=+1020.262416361 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "metrics-server-cert" not found Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.770707 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 10:29:38 crc kubenswrapper[4733]: E0318 10:29:38.770799 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:40.770774158 +0000 UTC m=+1020.262508513 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "webhook-server-cert" not found Mar 18 10:29:39 crc kubenswrapper[4733]: E0318 10:29:39.519244 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch" podUID="e64c7cd6-a04b-440e-ac47-40f672fbc333" Mar 18 10:29:39 crc kubenswrapper[4733]: E0318 10:29:39.519716 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" podUID="cd9234ed-fcbc-4d81-9034-27d39b3df6ee" Mar 18 10:29:39 crc kubenswrapper[4733]: E0318 10:29:39.519741 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7" podUID="759f85a1-4e24-4b61-879b-90801d648683" Mar 18 10:29:39 crc kubenswrapper[4733]: E0318 10:29:39.519763 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2" podUID="6ea742ac-3be9-4067-ab5a-032365494fde" Mar 18 10:29:39 crc kubenswrapper[4733]: E0318 10:29:39.521600 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" podUID="6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54" Mar 18 10:29:39 crc kubenswrapper[4733]: E0318 10:29:39.521594 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd" podUID="ae8a8fbc-d425-4da5-afb3-438a85a43722" Mar 18 10:29:39 crc kubenswrapper[4733]: E0318 10:29:39.547628 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj" podUID="4ad2d88a-c733-4409-b07b-5ff4661e1b68" Mar 18 10:29:40 crc kubenswrapper[4733]: I0318 10:29:40.194973 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: 
\"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:29:40 crc kubenswrapper[4733]: E0318 10:29:40.195163 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 10:29:40 crc kubenswrapper[4733]: E0318 10:29:40.195252 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert podName:651c7dd5-3adc-48b4-b579-309258aa3735 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:44.195232674 +0000 UTC m=+1023.686966999 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert") pod "infra-operator-controller-manager-74c694b97b-j4snz" (UID: "651c7dd5-3adc-48b4-b579-309258aa3735") : secret "infra-operator-webhook-server-cert" not found Mar 18 10:29:40 crc kubenswrapper[4733]: I0318 10:29:40.397923 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" Mar 18 10:29:40 crc kubenswrapper[4733]: E0318 10:29:40.398097 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:40 crc kubenswrapper[4733]: E0318 10:29:40.398166 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert podName:6eca2f16-53b8-4173-ace4-18b7292b1369 nodeName:}" failed. 
No retries permitted until 2026-03-18 10:29:44.398149803 +0000 UTC m=+1023.889884128 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" (UID: "6eca2f16-53b8-4173-ace4-18b7292b1369") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:40 crc kubenswrapper[4733]: I0318 10:29:40.814680 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:40 crc kubenswrapper[4733]: I0318 10:29:40.814763 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:40 crc kubenswrapper[4733]: E0318 10:29:40.814849 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 10:29:40 crc kubenswrapper[4733]: E0318 10:29:40.814910 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 10:29:40 crc kubenswrapper[4733]: E0318 10:29:40.814916 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. 
No retries permitted until 2026-03-18 10:29:44.814900229 +0000 UTC m=+1024.306634554 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "webhook-server-cert" not found Mar 18 10:29:40 crc kubenswrapper[4733]: E0318 10:29:40.814969 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:44.814955801 +0000 UTC m=+1024.306690126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "metrics-server-cert" not found Mar 18 10:29:43 crc kubenswrapper[4733]: I0318 10:29:43.570885 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:29:43 crc kubenswrapper[4733]: I0318 10:29:43.571323 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:29:44 crc kubenswrapper[4733]: I0318 10:29:44.268304 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:29:44 crc kubenswrapper[4733]: E0318 10:29:44.268528 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 10:29:44 crc kubenswrapper[4733]: E0318 10:29:44.268614 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert podName:651c7dd5-3adc-48b4-b579-309258aa3735 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:52.268590943 +0000 UTC m=+1031.760325338 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert") pod "infra-operator-controller-manager-74c694b97b-j4snz" (UID: "651c7dd5-3adc-48b4-b579-309258aa3735") : secret "infra-operator-webhook-server-cert" not found Mar 18 10:29:44 crc kubenswrapper[4733]: I0318 10:29:44.471589 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" Mar 18 10:29:44 crc kubenswrapper[4733]: E0318 10:29:44.471839 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:44 crc kubenswrapper[4733]: E0318 10:29:44.472020 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert 
podName:6eca2f16-53b8-4173-ace4-18b7292b1369 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:52.471997016 +0000 UTC m=+1031.963731361 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" (UID: "6eca2f16-53b8-4173-ace4-18b7292b1369") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:44 crc kubenswrapper[4733]: I0318 10:29:44.876420 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:44 crc kubenswrapper[4733]: I0318 10:29:44.876587 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:44 crc kubenswrapper[4733]: E0318 10:29:44.876635 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 10:29:44 crc kubenswrapper[4733]: E0318 10:29:44.876725 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:52.876702051 +0000 UTC m=+1032.368436446 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "metrics-server-cert" not found Mar 18 10:29:44 crc kubenswrapper[4733]: E0318 10:29:44.876786 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 10:29:44 crc kubenswrapper[4733]: E0318 10:29:44.876854 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:29:52.876833655 +0000 UTC m=+1032.368568040 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "webhook-server-cert" not found Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.622637 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk" event={"ID":"f93025ae-ebc3-4aed-bfde-e514d8b814ce","Type":"ContainerStarted","Data":"471ea8594a79d8f2f603499be05353299288503e28f7302da27e714c8e91c9f3"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.623163 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.633779 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5" 
event={"ID":"31999dbe-554e-4168-a902-1f62e82ce854","Type":"ContainerStarted","Data":"9855c34d85194f685066208c952a3c912a12cac69f6aa9c2e16fa29a45dc6639"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.633918 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.636337 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" event={"ID":"0fb2ba68-fa0f-4483-afdf-2eb381c54320","Type":"ContainerStarted","Data":"bd08a464f25b997614e25816121dcdf634e8a1cdc018e7a9099a3dd0fcc112c1"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.636802 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.643514 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc" event={"ID":"79dfdcde-0538-4777-959e-1daf2b6263de","Type":"ContainerStarted","Data":"c71bed7fb0c0f1badd4e2e99f5fdb45c98e995f1b2e899eea883d9051044b4d5"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.644067 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.648170 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" event={"ID":"bc0e28fc-cff0-4c39-8073-61d5d6481866","Type":"ContainerStarted","Data":"52486ba6a080e1684f6bf49c4cc34a4c30accc27c2f37344a6e7db1fb04a9ff2"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.648858 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.695070 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7" event={"ID":"de7565f5-677b-4aeb-90ab-0d632b28b295","Type":"ContainerStarted","Data":"73bbdd7a5a2137122c760ed8676dff0b80127d8b5f32cd1ed3d915ab84e65207"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.695822 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.700093 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns" event={"ID":"bd5ae902-d036-4e52-983d-aa3e1a86dca8","Type":"ContainerStarted","Data":"9275c970ce11d7157c2d8eb0d744d0b6dbffe6ca0eb408e5963bb46c740847c2"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.700626 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.715440 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" event={"ID":"8fe910c4-798b-4381-a71d-697459f7f79a","Type":"ContainerStarted","Data":"738fe7648e2881bb14f06a1f26c69cafb94899de265e9a0d967a8c1fc7512891"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.716134 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.721028 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg" 
event={"ID":"216f9239-7d2e-483e-a89f-0955a518aa4a","Type":"ContainerStarted","Data":"e2fc3c58d162844f3cb91d62cbe3c85e1f5f2ef70322bba2afe7c5f97c10d1d5"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.721739 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.723805 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc" event={"ID":"fd146b1e-59a9-4246-9520-f2d6f6cf6cd1","Type":"ContainerStarted","Data":"f17da01172944e9d80d11af12daace16ec5a3506eaaaefab4432d565d3f6802d"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.724211 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.727396 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24" event={"ID":"6762c515-b422-4157-a8ce-b9ca4781e134","Type":"ContainerStarted","Data":"0c36a7a036f48577f6a51a7367515826c8a5b92f44acf10407371c9c3f26f282"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.727819 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.727935 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk" podStartSLOduration=3.168346439 podStartE2EDuration="15.727919735s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.209049506 +0000 UTC m=+1017.700783831" lastFinishedPulling="2026-03-18 10:29:50.768622782 +0000 UTC m=+1030.260357127" observedRunningTime="2026-03-18 
10:29:51.726589307 +0000 UTC m=+1031.218323632" watchObservedRunningTime="2026-03-18 10:29:51.727919735 +0000 UTC m=+1031.219654060" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.754208 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" event={"ID":"748f4855-3978-4ecd-805e-0fee34ce0094","Type":"ContainerStarted","Data":"249205d73477e093aa2dbec75fa5a6f569b11b1f3effb2950ecd8e672a994f59"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.754254 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.765391 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf" event={"ID":"838f8a80-01c0-41d8-b431-2a23c9235fab","Type":"ContainerStarted","Data":"68c85c5e0e8adc64073999b3dc1858a7182233bef9805a6deb0937d07a7a72df"} Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.766004 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf" Mar 18 10:29:51 crc kubenswrapper[4733]: I0318 10:29:51.972032 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns" podStartSLOduration=3.379777279 podStartE2EDuration="15.972012792s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.127306456 +0000 UTC m=+1017.619040781" lastFinishedPulling="2026-03-18 10:29:50.719541979 +0000 UTC m=+1030.211276294" observedRunningTime="2026-03-18 10:29:51.97123657 +0000 UTC m=+1031.462970895" watchObservedRunningTime="2026-03-18 10:29:51.972012792 +0000 UTC m=+1031.463747127" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.145839 4733 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7" podStartSLOduration=3.51262107 podStartE2EDuration="16.145816615s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.120787181 +0000 UTC m=+1017.612521496" lastFinishedPulling="2026-03-18 10:29:50.753982716 +0000 UTC m=+1030.245717041" observedRunningTime="2026-03-18 10:29:52.134864394 +0000 UTC m=+1031.626598719" watchObservedRunningTime="2026-03-18 10:29:52.145816615 +0000 UTC m=+1031.637550960" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.285124 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc" podStartSLOduration=3.714139679 podStartE2EDuration="16.285106748s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.138599986 +0000 UTC m=+1017.630334311" lastFinishedPulling="2026-03-18 10:29:50.709567055 +0000 UTC m=+1030.201301380" observedRunningTime="2026-03-18 10:29:52.264513024 +0000 UTC m=+1031.756247349" watchObservedRunningTime="2026-03-18 10:29:52.285106748 +0000 UTC m=+1031.776841073" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.294101 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:29:52 crc kubenswrapper[4733]: E0318 10:29:52.294332 4733 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 10:29:52 crc kubenswrapper[4733]: E0318 10:29:52.294380 4733 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert podName:651c7dd5-3adc-48b4-b579-309258aa3735 nodeName:}" failed. No retries permitted until 2026-03-18 10:30:08.294367061 +0000 UTC m=+1047.786101376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert") pod "infra-operator-controller-manager-74c694b97b-j4snz" (UID: "651c7dd5-3adc-48b4-b579-309258aa3735") : secret "infra-operator-webhook-server-cert" not found Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.339458 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc" podStartSLOduration=3.724522144 podStartE2EDuration="16.33943963s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.122120819 +0000 UTC m=+1017.613855144" lastFinishedPulling="2026-03-18 10:29:50.737038295 +0000 UTC m=+1030.228772630" observedRunningTime="2026-03-18 10:29:52.338523944 +0000 UTC m=+1031.830258269" watchObservedRunningTime="2026-03-18 10:29:52.33943963 +0000 UTC m=+1031.831173955" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.419834 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg" podStartSLOduration=3.929982484 podStartE2EDuration="16.419820801s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.209038265 +0000 UTC m=+1017.700772590" lastFinishedPulling="2026-03-18 10:29:50.698876582 +0000 UTC m=+1030.190610907" observedRunningTime="2026-03-18 10:29:52.417534386 +0000 UTC m=+1031.909268711" watchObservedRunningTime="2026-03-18 10:29:52.419820801 +0000 UTC m=+1031.911555126" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.421624 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" podStartSLOduration=3.830720137 podStartE2EDuration="16.421619662s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.119140414 +0000 UTC m=+1017.610874739" lastFinishedPulling="2026-03-18 10:29:50.710039939 +0000 UTC m=+1030.201774264" observedRunningTime="2026-03-18 10:29:52.372087587 +0000 UTC m=+1031.863821912" watchObservedRunningTime="2026-03-18 10:29:52.421619662 +0000 UTC m=+1031.913353977" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.441983 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" podStartSLOduration=3.711302458 podStartE2EDuration="16.44196827s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:37.362369997 +0000 UTC m=+1016.854104322" lastFinishedPulling="2026-03-18 10:29:50.093035809 +0000 UTC m=+1029.584770134" observedRunningTime="2026-03-18 10:29:52.439251473 +0000 UTC m=+1031.930985798" watchObservedRunningTime="2026-03-18 10:29:52.44196827 +0000 UTC m=+1031.933702595" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.466043 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5" podStartSLOduration=3.978026058 podStartE2EDuration="16.465975531s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.198865527 +0000 UTC m=+1017.690599852" lastFinishedPulling="2026-03-18 10:29:50.686815 +0000 UTC m=+1030.178549325" observedRunningTime="2026-03-18 10:29:52.462966866 +0000 UTC m=+1031.954701201" watchObservedRunningTime="2026-03-18 10:29:52.465975531 +0000 UTC m=+1031.957709856" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.496299 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" Mar 18 10:29:52 crc kubenswrapper[4733]: E0318 10:29:52.496438 4733 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:52 crc kubenswrapper[4733]: E0318 10:29:52.496495 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert podName:6eca2f16-53b8-4173-ace4-18b7292b1369 nodeName:}" failed. No retries permitted until 2026-03-18 10:30:08.496479307 +0000 UTC m=+1047.988213622 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" (UID: "6eca2f16-53b8-4173-ace4-18b7292b1369") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.504290 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" podStartSLOduration=4.213550232 podStartE2EDuration="16.504274258s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:37.802366394 +0000 UTC m=+1017.294100719" lastFinishedPulling="2026-03-18 10:29:50.09309042 +0000 UTC m=+1029.584824745" observedRunningTime="2026-03-18 10:29:52.488933012 +0000 UTC m=+1031.980667337" watchObservedRunningTime="2026-03-18 10:29:52.504274258 +0000 UTC m=+1031.996008573" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.509472 4733 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf" podStartSLOduration=3.926944169 podStartE2EDuration="16.509460435s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.155689412 +0000 UTC m=+1017.647423727" lastFinishedPulling="2026-03-18 10:29:50.738205668 +0000 UTC m=+1030.229939993" observedRunningTime="2026-03-18 10:29:52.504089563 +0000 UTC m=+1031.995823888" watchObservedRunningTime="2026-03-18 10:29:52.509460435 +0000 UTC m=+1032.001194750" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.544803 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24" podStartSLOduration=4.040355166 podStartE2EDuration="16.544783747s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.20426993 +0000 UTC m=+1017.696004255" lastFinishedPulling="2026-03-18 10:29:50.708698521 +0000 UTC m=+1030.200432836" observedRunningTime="2026-03-18 10:29:52.544604182 +0000 UTC m=+1032.036338507" watchObservedRunningTime="2026-03-18 10:29:52.544783747 +0000 UTC m=+1032.036518072" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.564134 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" podStartSLOduration=3.581592067 podStartE2EDuration="16.564117836s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:37.765015734 +0000 UTC m=+1017.256750059" lastFinishedPulling="2026-03-18 10:29:50.747541503 +0000 UTC m=+1030.239275828" observedRunningTime="2026-03-18 10:29:52.559417333 +0000 UTC m=+1032.051151658" watchObservedRunningTime="2026-03-18 10:29:52.564117836 +0000 UTC m=+1032.055852161" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.906042 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:52 crc kubenswrapper[4733]: I0318 10:29:52.906288 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:29:52 crc kubenswrapper[4733]: E0318 10:29:52.907375 4733 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 10:29:52 crc kubenswrapper[4733]: E0318 10:29:52.907435 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:30:08.907419219 +0000 UTC m=+1048.399153544 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "metrics-server-cert" not found Mar 18 10:29:52 crc kubenswrapper[4733]: E0318 10:29:52.907780 4733 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 10:29:52 crc kubenswrapper[4733]: E0318 10:29:52.907847 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs podName:a4b7e706-a9a7-490a-84a8-094d1d909ba8 nodeName:}" failed. No retries permitted until 2026-03-18 10:30:08.907830131 +0000 UTC m=+1048.399564456 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs") pod "openstack-operator-controller-manager-85877db48-qvlf2" (UID: "a4b7e706-a9a7-490a-84a8-094d1d909ba8") : secret "webhook-server-cert" not found Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.498682 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-sfv8v" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.505411 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v6zxn" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.550912 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-t8796" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.584753 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ljvrt" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.699750 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cxlns" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.716349 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wkjtf" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.750229 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pcscc" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.875341 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tp4s7" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.898393 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-4xzlc" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.920854 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.941476 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jmwdk" Mar 18 10:29:56 crc kubenswrapper[4733]: I0318 10:29:56.964483 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-22wt5" Mar 18 10:29:57 crc kubenswrapper[4733]: I0318 10:29:57.096915 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-flv24"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.141471 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563830-2qqd2"]
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.142835 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-2qqd2"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.145106 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.145422 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.145589 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.155502 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-2qqd2"]
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.164362 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"]
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.165133 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.167019 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.167693 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.183717 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"]
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.221457 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a150895-3ed5-4f19-8a97-3b65f3254672-config-volume\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.221538 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz2gq\" (UniqueName: \"kubernetes.io/projected/3c8eb139-c576-4ab3-8c2b-a309f3aa4a35-kube-api-access-vz2gq\") pod \"auto-csr-approver-29563830-2qqd2\" (UID: \"3c8eb139-c576-4ab3-8c2b-a309f3aa4a35\") " pod="openshift-infra/auto-csr-approver-29563830-2qqd2"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.221607 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a150895-3ed5-4f19-8a97-3b65f3254672-secret-volume\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.221669 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz865\" (UniqueName: \"kubernetes.io/projected/4a150895-3ed5-4f19-8a97-3b65f3254672-kube-api-access-zz865\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.323095 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz2gq\" (UniqueName: \"kubernetes.io/projected/3c8eb139-c576-4ab3-8c2b-a309f3aa4a35-kube-api-access-vz2gq\") pod \"auto-csr-approver-29563830-2qqd2\" (UID: \"3c8eb139-c576-4ab3-8c2b-a309f3aa4a35\") " pod="openshift-infra/auto-csr-approver-29563830-2qqd2"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.323237 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a150895-3ed5-4f19-8a97-3b65f3254672-secret-volume\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.323315 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz865\" (UniqueName: \"kubernetes.io/projected/4a150895-3ed5-4f19-8a97-3b65f3254672-kube-api-access-zz865\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.323363 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a150895-3ed5-4f19-8a97-3b65f3254672-config-volume\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.324569 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a150895-3ed5-4f19-8a97-3b65f3254672-config-volume\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.339755 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a150895-3ed5-4f19-8a97-3b65f3254672-secret-volume\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.341874 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz2gq\" (UniqueName: \"kubernetes.io/projected/3c8eb139-c576-4ab3-8c2b-a309f3aa4a35-kube-api-access-vz2gq\") pod \"auto-csr-approver-29563830-2qqd2\" (UID: \"3c8eb139-c576-4ab3-8c2b-a309f3aa4a35\") " pod="openshift-infra/auto-csr-approver-29563830-2qqd2"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.346531 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz865\" (UniqueName: \"kubernetes.io/projected/4a150895-3ed5-4f19-8a97-3b65f3254672-kube-api-access-zz865\") pod \"collect-profiles-29563830-kdgdh\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.523075 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-2qqd2"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.531859 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.838070 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7" event={"ID":"759f85a1-4e24-4b61-879b-90801d648683","Type":"ContainerStarted","Data":"524add925102e84399badc3832c397a565f5c8e3d285558bd513f60adefbe52c"}
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.838471 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.839523 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2" event={"ID":"6ea742ac-3be9-4067-ab5a-032365494fde","Type":"ContainerStarted","Data":"09cefc9491e9ff975d4e45fc5e9f851f90830f8364ec6c4df8421db2f20d0b62"}
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.839718 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.841163 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd" event={"ID":"ae8a8fbc-d425-4da5-afb3-438a85a43722","Type":"ContainerStarted","Data":"e63d7e02138093bf3a90cb47a652968ea8c4f8c14758e128a67af5acc0b36599"}
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.841393 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.864723 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7" podStartSLOduration=3.55700833 podStartE2EDuration="24.864706621s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.310227847 +0000 UTC m=+1017.801962172" lastFinishedPulling="2026-03-18 10:29:59.617926138 +0000 UTC m=+1039.109660463" observedRunningTime="2026-03-18 10:30:00.860603255 +0000 UTC m=+1040.352337580" watchObservedRunningTime="2026-03-18 10:30:00.864706621 +0000 UTC m=+1040.356440946"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.865930 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj" event={"ID":"4ad2d88a-c733-4409-b07b-5ff4661e1b68","Type":"ContainerStarted","Data":"0337ff8f516557269758798f022de29676d9583baad256014db066578e9e029c"}
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.866500 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.869481 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch" event={"ID":"e64c7cd6-a04b-440e-ac47-40f672fbc333","Type":"ContainerStarted","Data":"773d54dfac2ac55d8e5d7689898349f0d7536703793e842660114b5e23d74f56"}
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.872095 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" event={"ID":"cd9234ed-fcbc-4d81-9034-27d39b3df6ee","Type":"ContainerStarted","Data":"70b824e8acd9e58d6693782cc83587264fbb5786b581efbfd2c997d24705f6f5"}
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.872454 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.873356 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" event={"ID":"6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54","Type":"ContainerStarted","Data":"2a4b3510fbc76543281dfd4f9a396cd04735643b77c8b669c851732e62a021d9"}
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.873666 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.894994 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd" podStartSLOduration=3.518320477 podStartE2EDuration="24.894976837s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.212130203 +0000 UTC m=+1017.703864528" lastFinishedPulling="2026-03-18 10:29:59.588786563 +0000 UTC m=+1039.080520888" observedRunningTime="2026-03-18 10:30:00.889096681 +0000 UTC m=+1040.380831006" watchObservedRunningTime="2026-03-18 10:30:00.894976837 +0000 UTC m=+1040.386711162"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.919935 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2" podStartSLOduration=3.700862483 podStartE2EDuration="24.919919643s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.330328998 +0000 UTC m=+1017.822063323" lastFinishedPulling="2026-03-18 10:29:59.549386118 +0000 UTC m=+1039.041120483" observedRunningTime="2026-03-18 10:30:00.91943954 +0000 UTC m=+1040.411173855" watchObservedRunningTime="2026-03-18 10:30:00.919919643 +0000 UTC m=+1040.411653968"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.958471 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj" podStartSLOduration=3.743723086 podStartE2EDuration="24.958448804s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.329673179 +0000 UTC m=+1017.821407504" lastFinishedPulling="2026-03-18 10:29:59.544398897 +0000 UTC m=+1039.036133222" observedRunningTime="2026-03-18 10:30:00.945781885 +0000 UTC m=+1040.437516210" watchObservedRunningTime="2026-03-18 10:30:00.958448804 +0000 UTC m=+1040.450183129"
Mar 18 10:30:00 crc kubenswrapper[4733]: I0318 10:30:00.982055 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" podStartSLOduration=3.656803724 podStartE2EDuration="24.982030591s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.312693597 +0000 UTC m=+1017.804427912" lastFinishedPulling="2026-03-18 10:29:59.637920454 +0000 UTC m=+1039.129654779" observedRunningTime="2026-03-18 10:30:00.97846585 +0000 UTC m=+1040.470200175" watchObservedRunningTime="2026-03-18 10:30:00.982030591 +0000 UTC m=+1040.473764916"
Mar 18 10:30:01 crc kubenswrapper[4733]: I0318 10:30:01.012613 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k64ch" podStartSLOduration=3.7992002449999998 podStartE2EDuration="25.012597806s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.329434382 +0000 UTC m=+1017.821168707" lastFinishedPulling="2026-03-18 10:29:59.542831933 +0000 UTC m=+1039.034566268" observedRunningTime="2026-03-18 10:30:01.010548938 +0000 UTC m=+1040.502283263" watchObservedRunningTime="2026-03-18 10:30:01.012597806 +0000 UTC m=+1040.504332121"
Mar 18 10:30:01 crc kubenswrapper[4733]: I0318 10:30:01.055704 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-2qqd2"]
Mar 18 10:30:01 crc kubenswrapper[4733]: W0318 10:30:01.060861 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c8eb139_c576_4ab3_8c2b_a309f3aa4a35.slice/crio-8d4e6c4fcb2ee48b7889361535947998ed4f0556343598ce1b623b89f6b9e765 WatchSource:0}: Error finding container 8d4e6c4fcb2ee48b7889361535947998ed4f0556343598ce1b623b89f6b9e765: Status 404 returned error can't find the container with id 8d4e6c4fcb2ee48b7889361535947998ed4f0556343598ce1b623b89f6b9e765
Mar 18 10:30:01 crc kubenswrapper[4733]: I0318 10:30:01.091907 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" podStartSLOduration=3.844421686 podStartE2EDuration="25.09189305s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:29:38.335897806 +0000 UTC m=+1017.827632131" lastFinishedPulling="2026-03-18 10:29:59.58336917 +0000 UTC m=+1039.075103495" observedRunningTime="2026-03-18 10:30:01.073356115 +0000 UTC m=+1040.565090440" watchObservedRunningTime="2026-03-18 10:30:01.09189305 +0000 UTC m=+1040.583627375"
Mar 18 10:30:01 crc kubenswrapper[4733]: W0318 10:30:01.098854 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a150895_3ed5_4f19_8a97_3b65f3254672.slice/crio-1480a7f03b93c1de37bb75e6fb6e0a26c994052f0e7ff7ab7ed209cd18d75440 WatchSource:0}: Error finding container 1480a7f03b93c1de37bb75e6fb6e0a26c994052f0e7ff7ab7ed209cd18d75440: Status 404 returned error can't find the container with id 1480a7f03b93c1de37bb75e6fb6e0a26c994052f0e7ff7ab7ed209cd18d75440
Mar 18 10:30:01 crc kubenswrapper[4733]: I0318 10:30:01.099263 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"]
Mar 18 10:30:01 crc kubenswrapper[4733]: I0318 10:30:01.883106 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-2qqd2" event={"ID":"3c8eb139-c576-4ab3-8c2b-a309f3aa4a35","Type":"ContainerStarted","Data":"8d4e6c4fcb2ee48b7889361535947998ed4f0556343598ce1b623b89f6b9e765"}
Mar 18 10:30:01 crc kubenswrapper[4733]: I0318 10:30:01.885055 4733 generic.go:334] "Generic (PLEG): container finished" podID="4a150895-3ed5-4f19-8a97-3b65f3254672" containerID="1afbfc2e0fbf329edca7836eaa8d44fdbd1ae521356c2969dfea34427ac61545" exitCode=0
Mar 18 10:30:01 crc kubenswrapper[4733]: I0318 10:30:01.885172 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh" event={"ID":"4a150895-3ed5-4f19-8a97-3b65f3254672","Type":"ContainerDied","Data":"1afbfc2e0fbf329edca7836eaa8d44fdbd1ae521356c2969dfea34427ac61545"}
Mar 18 10:30:01 crc kubenswrapper[4733]: I0318 10:30:01.885310 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh" event={"ID":"4a150895-3ed5-4f19-8a97-3b65f3254672","Type":"ContainerStarted","Data":"1480a7f03b93c1de37bb75e6fb6e0a26c994052f0e7ff7ab7ed209cd18d75440"}
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.154485 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.271013 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz865\" (UniqueName: \"kubernetes.io/projected/4a150895-3ed5-4f19-8a97-3b65f3254672-kube-api-access-zz865\") pod \"4a150895-3ed5-4f19-8a97-3b65f3254672\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") "
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.271588 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a150895-3ed5-4f19-8a97-3b65f3254672-config-volume\") pod \"4a150895-3ed5-4f19-8a97-3b65f3254672\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") "
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.271958 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a150895-3ed5-4f19-8a97-3b65f3254672-secret-volume\") pod \"4a150895-3ed5-4f19-8a97-3b65f3254672\" (UID: \"4a150895-3ed5-4f19-8a97-3b65f3254672\") "
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.272579 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a150895-3ed5-4f19-8a97-3b65f3254672-config-volume" (OuterVolumeSpecName: "config-volume") pod "4a150895-3ed5-4f19-8a97-3b65f3254672" (UID: "4a150895-3ed5-4f19-8a97-3b65f3254672"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.273160 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a150895-3ed5-4f19-8a97-3b65f3254672-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.279365 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a150895-3ed5-4f19-8a97-3b65f3254672-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4a150895-3ed5-4f19-8a97-3b65f3254672" (UID: "4a150895-3ed5-4f19-8a97-3b65f3254672"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.279409 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a150895-3ed5-4f19-8a97-3b65f3254672-kube-api-access-zz865" (OuterVolumeSpecName: "kube-api-access-zz865") pod "4a150895-3ed5-4f19-8a97-3b65f3254672" (UID: "4a150895-3ed5-4f19-8a97-3b65f3254672"). InnerVolumeSpecName "kube-api-access-zz865". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.376228 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz865\" (UniqueName: \"kubernetes.io/projected/4a150895-3ed5-4f19-8a97-3b65f3254672-kube-api-access-zz865\") on node \"crc\" DevicePath \"\""
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.376292 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a150895-3ed5-4f19-8a97-3b65f3254672-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.903044 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh" event={"ID":"4a150895-3ed5-4f19-8a97-3b65f3254672","Type":"ContainerDied","Data":"1480a7f03b93c1de37bb75e6fb6e0a26c994052f0e7ff7ab7ed209cd18d75440"}
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.903380 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1480a7f03b93c1de37bb75e6fb6e0a26c994052f0e7ff7ab7ed209cd18d75440"
Mar 18 10:30:03 crc kubenswrapper[4733]: I0318 10:30:03.903138 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563830-kdgdh"
Mar 18 10:30:06 crc kubenswrapper[4733]: I0318 10:30:06.825177 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-chmbd"
Mar 18 10:30:07 crc kubenswrapper[4733]: I0318 10:30:07.116102 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9txbj"
Mar 18 10:30:07 crc kubenswrapper[4733]: I0318 10:30:07.174719 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fd4t7"
Mar 18 10:30:07 crc kubenswrapper[4733]: I0318 10:30:07.190094 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-v2pb2"
Mar 18 10:30:07 crc kubenswrapper[4733]: I0318 10:30:07.254575 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj"
Mar 18 10:30:07 crc kubenswrapper[4733]: I0318 10:30:07.269810 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.359255 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.371352 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c7dd5-3adc-48b4-b579-309258aa3735-cert\") pod \"infra-operator-controller-manager-74c694b97b-j4snz\" (UID: \"651c7dd5-3adc-48b4-b579-309258aa3735\") " pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.529415 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vp5bk"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.537228 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.562976 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.570308 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eca2f16-53b8-4173-ace4-18b7292b1369-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-s6rbv\" (UID: \"6eca2f16-53b8-4173-ace4-18b7292b1369\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.851420 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gwtrb"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.858772 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.978086 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.978492 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.982916 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-webhook-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.983017 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4b7e706-a9a7-490a-84a8-094d1d909ba8-metrics-certs\") pod \"openstack-operator-controller-manager-85877db48-qvlf2\" (UID: \"a4b7e706-a9a7-490a-84a8-094d1d909ba8\") " pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:30:08 crc kubenswrapper[4733]: I0318 10:30:08.992932 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz"]
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.101264 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5xhfs"
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.109570 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.124339 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv"]
Mar 18 10:30:09 crc kubenswrapper[4733]: W0318 10:30:09.131898 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eca2f16_53b8_4173_ace4_18b7292b1369.slice/crio-287488f351efea943c590067dadef0b34f1a7fcc14089ba9f594eddec06af49f WatchSource:0}: Error finding container 287488f351efea943c590067dadef0b34f1a7fcc14089ba9f594eddec06af49f: Status 404 returned error can't find the container with id 287488f351efea943c590067dadef0b34f1a7fcc14089ba9f594eddec06af49f
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.331415 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"]
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.951388 4733 generic.go:334] "Generic (PLEG): container finished" podID="3c8eb139-c576-4ab3-8c2b-a309f3aa4a35" containerID="be977da7d932bb787db9cafb1727d3bd50b5e03495d1e8a82c232ed7c66e241e" exitCode=0
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.951449 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-2qqd2" event={"ID":"3c8eb139-c576-4ab3-8c2b-a309f3aa4a35","Type":"ContainerDied","Data":"be977da7d932bb787db9cafb1727d3bd50b5e03495d1e8a82c232ed7c66e241e"}
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.953630 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" event={"ID":"6eca2f16-53b8-4173-ace4-18b7292b1369","Type":"ContainerStarted","Data":"287488f351efea943c590067dadef0b34f1a7fcc14089ba9f594eddec06af49f"}
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.956121 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" event={"ID":"a4b7e706-a9a7-490a-84a8-094d1d909ba8","Type":"ContainerStarted","Data":"fc4421f4ae5159781decebb7cafcfbdfcf0728a6d7062d996cbac8ffa23ff30d"}
Mar 18 10:30:09 crc kubenswrapper[4733]: I0318 10:30:09.957560 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" event={"ID":"651c7dd5-3adc-48b4-b579-309258aa3735","Type":"ContainerStarted","Data":"12cfa674f121e9b5a9b930563f89df0d163c4e8f5b65a11351773057b72f3df1"}
Mar 18 10:30:10 crc kubenswrapper[4733]: I0318 10:30:10.971523 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" event={"ID":"a4b7e706-a9a7-490a-84a8-094d1d909ba8","Type":"ContainerStarted","Data":"19a4ab4122ea3ab13731e02ddad2ee9bbfc4c18d66f55e966aed85c4621a2208"}
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.017133 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" podStartSLOduration=35.017106993 podStartE2EDuration="35.017106993s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:30:11.007416188 +0000 UTC m=+1050.499150533" watchObservedRunningTime="2026-03-18 10:30:11.017106993 +0000 UTC m=+1050.508841328"
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.300976 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-2qqd2"
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.416278 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz2gq\" (UniqueName: \"kubernetes.io/projected/3c8eb139-c576-4ab3-8c2b-a309f3aa4a35-kube-api-access-vz2gq\") pod \"3c8eb139-c576-4ab3-8c2b-a309f3aa4a35\" (UID: \"3c8eb139-c576-4ab3-8c2b-a309f3aa4a35\") "
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.424758 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c8eb139-c576-4ab3-8c2b-a309f3aa4a35-kube-api-access-vz2gq" (OuterVolumeSpecName: "kube-api-access-vz2gq") pod "3c8eb139-c576-4ab3-8c2b-a309f3aa4a35" (UID: "3c8eb139-c576-4ab3-8c2b-a309f3aa4a35"). InnerVolumeSpecName "kube-api-access-vz2gq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.518054 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz2gq\" (UniqueName: \"kubernetes.io/projected/3c8eb139-c576-4ab3-8c2b-a309f3aa4a35-kube-api-access-vz2gq\") on node \"crc\" DevicePath \"\""
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.986524 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563830-2qqd2" event={"ID":"3c8eb139-c576-4ab3-8c2b-a309f3aa4a35","Type":"ContainerDied","Data":"8d4e6c4fcb2ee48b7889361535947998ed4f0556343598ce1b623b89f6b9e765"}
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.989023 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d4e6c4fcb2ee48b7889361535947998ed4f0556343598ce1b623b89f6b9e765"
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.989074 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2"
Mar 18 10:30:11 crc kubenswrapper[4733]: I0318 10:30:11.986598 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563830-2qqd2"
Mar 18 10:30:12 crc kubenswrapper[4733]: I0318 10:30:12.386763 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-l49xk"]
Mar 18 10:30:12 crc kubenswrapper[4733]: I0318 10:30:12.393088 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563824-l49xk"]
Mar 18 10:30:13 crc kubenswrapper[4733]: I0318 10:30:13.195665 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6077c15f-e285-4625-b336-a84327b1af2d" path="/var/lib/kubelet/pods/6077c15f-e285-4625-b336-a84327b1af2d/volumes"
Mar 18 10:30:13 crc kubenswrapper[4733]: I0318 10:30:13.570887 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:30:13 crc kubenswrapper[4733]: I0318 10:30:13.570956 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:30:13 crc kubenswrapper[4733]: I0318 10:30:13.571012 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp"
Mar 18 10:30:13 crc kubenswrapper[4733]: I0318 10:30:13.571635 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a78644e078fbb319d0fc66d47cfb2501076e4fd678ad793e791ddb4f3d3ee96"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 10:30:13 crc kubenswrapper[4733]: I0318 10:30:13.571698 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://2a78644e078fbb319d0fc66d47cfb2501076e4fd678ad793e791ddb4f3d3ee96" gracePeriod=600
Mar 18 10:30:14 crc kubenswrapper[4733]: I0318 10:30:14.004404 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="2a78644e078fbb319d0fc66d47cfb2501076e4fd678ad793e791ddb4f3d3ee96" exitCode=0
Mar 18 10:30:14 crc kubenswrapper[4733]: I0318 10:30:14.004449 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"2a78644e078fbb319d0fc66d47cfb2501076e4fd678ad793e791ddb4f3d3ee96"}
Mar 18 10:30:14 crc kubenswrapper[4733]: I0318 10:30:14.004485 4733 scope.go:117] "RemoveContainer" containerID="a11e956cdd33846b5919c35822db029436f82987d5e2c2bb6427c6d1dfd2098c"
Mar 18 10:30:15 crc kubenswrapper[4733]: I0318 10:30:15.012713 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"345f1c51e0b2f38e27fd31ce4a7323d51ffa4b8419f456177dd8653558afb625"}
Mar 18 10:30:15 crc kubenswrapper[4733]: I0318 10:30:15.015118 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" event={"ID":"6eca2f16-53b8-4173-ace4-18b7292b1369","Type":"ContainerStarted","Data":"61a53e6a37ef75cdfb62d9ca277be1f5b865c3fe9a1b65b59be71e8453a3d337"}
Mar 18 10:30:15 crc kubenswrapper[4733]: I0318
10:30:15.015536 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" Mar 18 10:30:15 crc kubenswrapper[4733]: I0318 10:30:15.016829 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" event={"ID":"651c7dd5-3adc-48b4-b579-309258aa3735","Type":"ContainerStarted","Data":"226fcfa3ddb6adc8e926f181f47c3dd1d4bd8a04122f509d852cfa38b5c8743f"} Mar 18 10:30:15 crc kubenswrapper[4733]: I0318 10:30:15.017530 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:30:15 crc kubenswrapper[4733]: I0318 10:30:15.040503 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" podStartSLOduration=34.04694815 podStartE2EDuration="39.040482711s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:30:09.003245733 +0000 UTC m=+1048.494980058" lastFinishedPulling="2026-03-18 10:30:13.996780294 +0000 UTC m=+1053.488514619" observedRunningTime="2026-03-18 10:30:15.036694304 +0000 UTC m=+1054.528428629" watchObservedRunningTime="2026-03-18 10:30:15.040482711 +0000 UTC m=+1054.532217036" Mar 18 10:30:15 crc kubenswrapper[4733]: I0318 10:30:15.061023 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" podStartSLOduration=34.163728425 podStartE2EDuration="39.061005502s" podCreationTimestamp="2026-03-18 10:29:36 +0000 UTC" firstStartedPulling="2026-03-18 10:30:09.136810653 +0000 UTC m=+1048.628544988" lastFinishedPulling="2026-03-18 10:30:14.03408774 +0000 UTC m=+1053.525822065" observedRunningTime="2026-03-18 10:30:15.05917518 +0000 UTC m=+1054.550909505" 
watchObservedRunningTime="2026-03-18 10:30:15.061005502 +0000 UTC m=+1054.552739837" Mar 18 10:30:19 crc kubenswrapper[4733]: I0318 10:30:19.120856 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-85877db48-qvlf2" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.144001 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gxrnl"] Mar 18 10:30:28 crc kubenswrapper[4733]: E0318 10:30:28.144734 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a150895-3ed5-4f19-8a97-3b65f3254672" containerName="collect-profiles" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.144745 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a150895-3ed5-4f19-8a97-3b65f3254672" containerName="collect-profiles" Mar 18 10:30:28 crc kubenswrapper[4733]: E0318 10:30:28.144767 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8eb139-c576-4ab3-8c2b-a309f3aa4a35" containerName="oc" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.144773 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8eb139-c576-4ab3-8c2b-a309f3aa4a35" containerName="oc" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.144898 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a150895-3ed5-4f19-8a97-3b65f3254672" containerName="collect-profiles" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.144918 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c8eb139-c576-4ab3-8c2b-a309f3aa4a35" containerName="oc" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.145834 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.162851 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gxrnl"] Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.180704 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-utilities\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.180739 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42hs8\" (UniqueName: \"kubernetes.io/projected/be59cd8b-2e8d-41ae-8c18-1f6dea878859-kube-api-access-42hs8\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.180886 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-catalog-content\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.282342 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-catalog-content\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.282417 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-utilities\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.282437 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42hs8\" (UniqueName: \"kubernetes.io/projected/be59cd8b-2e8d-41ae-8c18-1f6dea878859-kube-api-access-42hs8\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.282768 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-catalog-content\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.283037 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-utilities\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.308577 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42hs8\" (UniqueName: \"kubernetes.io/projected/be59cd8b-2e8d-41ae-8c18-1f6dea878859-kube-api-access-42hs8\") pod \"certified-operators-gxrnl\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.464484 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.544648 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-74c694b97b-j4snz" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.868075 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-s6rbv" Mar 18 10:30:28 crc kubenswrapper[4733]: I0318 10:30:28.967340 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gxrnl"] Mar 18 10:30:29 crc kubenswrapper[4733]: I0318 10:30:29.138288 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxrnl" event={"ID":"be59cd8b-2e8d-41ae-8c18-1f6dea878859","Type":"ContainerStarted","Data":"3f798c2bf3f5a1669f683d8681c89d7676157ae54ed2fa5529e930b00288aa89"} Mar 18 10:30:30 crc kubenswrapper[4733]: I0318 10:30:30.151474 4733 generic.go:334] "Generic (PLEG): container finished" podID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerID="1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7" exitCode=0 Mar 18 10:30:30 crc kubenswrapper[4733]: I0318 10:30:30.151814 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxrnl" event={"ID":"be59cd8b-2e8d-41ae-8c18-1f6dea878859","Type":"ContainerDied","Data":"1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7"} Mar 18 10:30:31 crc kubenswrapper[4733]: I0318 10:30:31.159371 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxrnl" event={"ID":"be59cd8b-2e8d-41ae-8c18-1f6dea878859","Type":"ContainerStarted","Data":"1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7"} Mar 18 10:30:32 crc kubenswrapper[4733]: I0318 10:30:32.169483 4733 
generic.go:334] "Generic (PLEG): container finished" podID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerID="1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7" exitCode=0 Mar 18 10:30:32 crc kubenswrapper[4733]: I0318 10:30:32.169595 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxrnl" event={"ID":"be59cd8b-2e8d-41ae-8c18-1f6dea878859","Type":"ContainerDied","Data":"1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7"} Mar 18 10:30:33 crc kubenswrapper[4733]: I0318 10:30:33.202654 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxrnl" event={"ID":"be59cd8b-2e8d-41ae-8c18-1f6dea878859","Type":"ContainerStarted","Data":"88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883"} Mar 18 10:30:33 crc kubenswrapper[4733]: I0318 10:30:33.214929 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gxrnl" podStartSLOduration=2.795983299 podStartE2EDuration="5.214906241s" podCreationTimestamp="2026-03-18 10:30:28 +0000 UTC" firstStartedPulling="2026-03-18 10:30:30.154466014 +0000 UTC m=+1069.646200379" lastFinishedPulling="2026-03-18 10:30:32.573388976 +0000 UTC m=+1072.065123321" observedRunningTime="2026-03-18 10:30:33.21241585 +0000 UTC m=+1072.704150205" watchObservedRunningTime="2026-03-18 10:30:33.214906241 +0000 UTC m=+1072.706640586" Mar 18 10:30:38 crc kubenswrapper[4733]: I0318 10:30:38.465685 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:38 crc kubenswrapper[4733]: I0318 10:30:38.465762 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:38 crc kubenswrapper[4733]: I0318 10:30:38.523711 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:39 crc kubenswrapper[4733]: I0318 10:30:39.300405 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:39 crc kubenswrapper[4733]: I0318 10:30:39.346878 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gxrnl"] Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.252513 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gxrnl" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerName="registry-server" containerID="cri-o://88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883" gracePeriod=2 Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.711021 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.767771 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-utilities\") pod \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.767955 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42hs8\" (UniqueName: \"kubernetes.io/projected/be59cd8b-2e8d-41ae-8c18-1f6dea878859-kube-api-access-42hs8\") pod \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.768029 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-catalog-content\") pod 
\"be59cd8b-2e8d-41ae-8c18-1f6dea878859\" (UID: \"be59cd8b-2e8d-41ae-8c18-1f6dea878859\") " Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.768901 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-utilities" (OuterVolumeSpecName: "utilities") pod "be59cd8b-2e8d-41ae-8c18-1f6dea878859" (UID: "be59cd8b-2e8d-41ae-8c18-1f6dea878859"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.775030 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be59cd8b-2e8d-41ae-8c18-1f6dea878859-kube-api-access-42hs8" (OuterVolumeSpecName: "kube-api-access-42hs8") pod "be59cd8b-2e8d-41ae-8c18-1f6dea878859" (UID: "be59cd8b-2e8d-41ae-8c18-1f6dea878859"). InnerVolumeSpecName "kube-api-access-42hs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.831296 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be59cd8b-2e8d-41ae-8c18-1f6dea878859" (UID: "be59cd8b-2e8d-41ae-8c18-1f6dea878859"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.869722 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42hs8\" (UniqueName: \"kubernetes.io/projected/be59cd8b-2e8d-41ae-8c18-1f6dea878859-kube-api-access-42hs8\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.869768 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:41 crc kubenswrapper[4733]: I0318 10:30:41.869781 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be59cd8b-2e8d-41ae-8c18-1f6dea878859-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.260089 4733 generic.go:334] "Generic (PLEG): container finished" podID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerID="88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883" exitCode=0 Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.260138 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxrnl" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.260157 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxrnl" event={"ID":"be59cd8b-2e8d-41ae-8c18-1f6dea878859","Type":"ContainerDied","Data":"88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883"} Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.260641 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxrnl" event={"ID":"be59cd8b-2e8d-41ae-8c18-1f6dea878859","Type":"ContainerDied","Data":"3f798c2bf3f5a1669f683d8681c89d7676157ae54ed2fa5529e930b00288aa89"} Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.260673 4733 scope.go:117] "RemoveContainer" containerID="88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.288580 4733 scope.go:117] "RemoveContainer" containerID="1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.297513 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gxrnl"] Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.303826 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gxrnl"] Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.309626 4733 scope.go:117] "RemoveContainer" containerID="1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.330287 4733 scope.go:117] "RemoveContainer" containerID="88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883" Mar 18 10:30:42 crc kubenswrapper[4733]: E0318 10:30:42.330702 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883\": container with ID starting with 88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883 not found: ID does not exist" containerID="88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.330739 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883"} err="failed to get container status \"88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883\": rpc error: code = NotFound desc = could not find container \"88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883\": container with ID starting with 88d1674a4eaf402acf29785a0f8030272e4ba1e43ad27f542258f3ccecd14883 not found: ID does not exist" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.330764 4733 scope.go:117] "RemoveContainer" containerID="1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7" Mar 18 10:30:42 crc kubenswrapper[4733]: E0318 10:30:42.331030 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7\": container with ID starting with 1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7 not found: ID does not exist" containerID="1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.331078 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7"} err="failed to get container status \"1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7\": rpc error: code = NotFound desc = could not find container \"1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7\": container with ID 
starting with 1f2ad3b9ebbdb27ca1e9ac77090b1c05e9f6528d3dbbd9391a8e464389617db7 not found: ID does not exist" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.331109 4733 scope.go:117] "RemoveContainer" containerID="1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7" Mar 18 10:30:42 crc kubenswrapper[4733]: E0318 10:30:42.331454 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7\": container with ID starting with 1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7 not found: ID does not exist" containerID="1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7" Mar 18 10:30:42 crc kubenswrapper[4733]: I0318 10:30:42.331489 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7"} err="failed to get container status \"1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7\": rpc error: code = NotFound desc = could not find container \"1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7\": container with ID starting with 1a0d3f9f18feb5d8d80a2b6645f85d5b1828acf7d2348ee16404888b26c9e7d7 not found: ID does not exist" Mar 18 10:30:43 crc kubenswrapper[4733]: I0318 10:30:43.194475 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" path="/var/lib/kubelet/pods/be59cd8b-2e8d-41ae-8c18-1f6dea878859/volumes" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.206646 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7hxp6"] Mar 18 10:30:48 crc kubenswrapper[4733]: E0318 10:30:48.208587 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerName="registry-server" Mar 18 10:30:48 crc 
kubenswrapper[4733]: I0318 10:30:48.208719 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerName="registry-server" Mar 18 10:30:48 crc kubenswrapper[4733]: E0318 10:30:48.208805 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerName="extract-utilities" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.208875 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerName="extract-utilities" Mar 18 10:30:48 crc kubenswrapper[4733]: E0318 10:30:48.208965 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerName="extract-content" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.209040 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerName="extract-content" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.209312 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="be59cd8b-2e8d-41ae-8c18-1f6dea878859" containerName="registry-server" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.210462 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.215723 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7hxp6"] Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.219151 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.219554 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.223940 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-c44sb" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.224227 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.296625 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jhs6c"] Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.297821 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.309370 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.324057 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jhs6c"] Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.379895 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725b76ca-c6aa-47f4-b75b-7ba4cd999979-config\") pod \"dnsmasq-dns-675f4bcbfc-7hxp6\" (UID: \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.380027 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtr2d\" (UniqueName: \"kubernetes.io/projected/725b76ca-c6aa-47f4-b75b-7ba4cd999979-kube-api-access-mtr2d\") pod \"dnsmasq-dns-675f4bcbfc-7hxp6\" (UID: \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.480973 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c" Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.481029 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtr2d\" (UniqueName: \"kubernetes.io/projected/725b76ca-c6aa-47f4-b75b-7ba4cd999979-kube-api-access-mtr2d\") pod \"dnsmasq-dns-675f4bcbfc-7hxp6\" (UID: \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" Mar 
18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.481114 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-config\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.481141 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725b76ca-c6aa-47f4-b75b-7ba4cd999979-config\") pod \"dnsmasq-dns-675f4bcbfc-7hxp6\" (UID: \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.481211 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc6zn\" (UniqueName: \"kubernetes.io/projected/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-kube-api-access-dc6zn\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.482170 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725b76ca-c6aa-47f4-b75b-7ba4cd999979-config\") pod \"dnsmasq-dns-675f4bcbfc-7hxp6\" (UID: \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.503916 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtr2d\" (UniqueName: \"kubernetes.io/projected/725b76ca-c6aa-47f4-b75b-7ba4cd999979-kube-api-access-mtr2d\") pod \"dnsmasq-dns-675f4bcbfc-7hxp6\" (UID: \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.535656 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.582024 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-config\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.582702 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc6zn\" (UniqueName: \"kubernetes.io/projected/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-kube-api-access-dc6zn\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.582871 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.583032 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-config\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.583479 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.619106 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc6zn\" (UniqueName: \"kubernetes.io/projected/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-kube-api-access-dc6zn\") pod \"dnsmasq-dns-78dd6ddcc-jhs6c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.623502 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c"
Mar 18 10:30:48 crc kubenswrapper[4733]: I0318 10:30:48.932811 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7hxp6"]
Mar 18 10:30:49 crc kubenswrapper[4733]: I0318 10:30:49.051242 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jhs6c"]
Mar 18 10:30:49 crc kubenswrapper[4733]: W0318 10:30:49.052580 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod681a4fb9_f5dc_4b7d_aad7_45d15f11de1c.slice/crio-143eafdd483ed64fb00fabfeb37bfb5824c7d4e108597ececea0f80ea29d068b WatchSource:0}: Error finding container 143eafdd483ed64fb00fabfeb37bfb5824c7d4e108597ececea0f80ea29d068b: Status 404 returned error can't find the container with id 143eafdd483ed64fb00fabfeb37bfb5824c7d4e108597ececea0f80ea29d068b
Mar 18 10:30:49 crc kubenswrapper[4733]: I0318 10:30:49.319540 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c" event={"ID":"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c","Type":"ContainerStarted","Data":"143eafdd483ed64fb00fabfeb37bfb5824c7d4e108597ececea0f80ea29d068b"}
Mar 18 10:30:49 crc kubenswrapper[4733]: I0318 10:30:49.321828 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" event={"ID":"725b76ca-c6aa-47f4-b75b-7ba4cd999979","Type":"ContainerStarted","Data":"338afcfd61f57729b6823a0742441ae38ca9d73a3a0cb99ea518e0556be8e8f6"}
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.039456 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7hxp6"]
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.059932 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xh24t"]
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.061177 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.095895 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xh24t"]
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.231689 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-config\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.231749 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfbr\" (UniqueName: \"kubernetes.io/projected/a56bac49-b398-4b61-9b54-7969acd2dc93-kube-api-access-5mfbr\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.231840 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.333765 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.333838 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-config\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.333853 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfbr\" (UniqueName: \"kubernetes.io/projected/a56bac49-b398-4b61-9b54-7969acd2dc93-kube-api-access-5mfbr\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.334852 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.335342 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-config\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.374346 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jhs6c"]
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.389149 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfbr\" (UniqueName: \"kubernetes.io/projected/a56bac49-b398-4b61-9b54-7969acd2dc93-kube-api-access-5mfbr\") pod \"dnsmasq-dns-666b6646f7-xh24t\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.389524 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xh24t"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.398691 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tljb4"]
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.400006 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.410493 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tljb4"]
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.535855 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tggzq\" (UniqueName: \"kubernetes.io/projected/d6ec9568-99c8-4bee-a97c-46400fcc0e73-kube-api-access-tggzq\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.536237 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.536276 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-config\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.637737 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tggzq\" (UniqueName: \"kubernetes.io/projected/d6ec9568-99c8-4bee-a97c-46400fcc0e73-kube-api-access-tggzq\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.637864 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.637930 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-config\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.638800 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-config\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.638858 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.661261 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tggzq\" (UniqueName: \"kubernetes.io/projected/d6ec9568-99c8-4bee-a97c-46400fcc0e73-kube-api-access-tggzq\") pod \"dnsmasq-dns-57d769cc4f-tljb4\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.741591 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4"
Mar 18 10:30:51 crc kubenswrapper[4733]: I0318 10:30:51.975908 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xh24t"]
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.012459 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.013637 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.017367 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.017386 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.017685 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.017760 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-p7fvd"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.017857 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.017990 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.018063 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.023618 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145325 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wcnb\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-kube-api-access-7wcnb\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145374 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145411 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145439 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145459 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-config-data\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145482 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0570ce4-1455-4698-85cf-01f7108d9e7f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145510 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145524 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0570ce4-1455-4698-85cf-01f7108d9e7f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145538 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.145556 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.146368 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.179228 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tljb4"]
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247397 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247438 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0570ce4-1455-4698-85cf-01f7108d9e7f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247457 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247477 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247501 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247538 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wcnb\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-kube-api-access-7wcnb\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247560 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247587 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247611 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247627 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-config-data\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.247653 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0570ce4-1455-4698-85cf-01f7108d9e7f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.248644 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.248935 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.248949 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.250814 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.251140 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.251333 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0570ce4-1455-4698-85cf-01f7108d9e7f-config-data\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.260037 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.260113 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0570ce4-1455-4698-85cf-01f7108d9e7f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.260134 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0570ce4-1455-4698-85cf-01f7108d9e7f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.260236 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.265393 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wcnb\" (UniqueName: \"kubernetes.io/projected/f0570ce4-1455-4698-85cf-01f7108d9e7f-kube-api-access-7wcnb\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.268950 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"f0570ce4-1455-4698-85cf-01f7108d9e7f\") " pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.341430 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.360015 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.363572 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.366317 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.366644 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.366729 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.366788 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.366901 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.366920 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6884w"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.366921 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.376399 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.551941 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552002 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552029 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s62ng\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-kube-api-access-s62ng\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552082 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552121 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552144 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552168 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552225 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552273 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552315 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.552350 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.654651 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.654752 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s62ng\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-kube-api-access-s62ng\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.654827 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.654851 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.654886 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.654922 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.654965 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.654999 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.655045 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.655083 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.655123 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.656513 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.657377 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.657830 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.658509 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.659817 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.660307 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.661716 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.662450 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.657535 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.674113 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.678485 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s62ng\" (UniqueName: \"kubernetes.io/projected/b4a4e3e2-bd4d-4f8d-97bc-51267378ab03-kube-api-access-s62ng\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.684955 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:52 crc kubenswrapper[4733]: I0318 10:30:52.713812 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.565845 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.566919 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.575028 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.577508 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.577652 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7hltt" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.577754 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.577837 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.584584 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.691394 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.691462 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-kolla-config\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.691615 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-config-data-default\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.691699 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.691742 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.691768 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjj5v\" (UniqueName: \"kubernetes.io/projected/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-kube-api-access-zjj5v\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.691828 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.691893 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.792966 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.793036 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.793083 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-kolla-config\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.793128 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-config-data-default\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.793169 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.793212 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.793237 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjj5v\" (UniqueName: \"kubernetes.io/projected/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-kube-api-access-zjj5v\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.793276 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.794718 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-config-data-default\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.796031 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.796613 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-kolla-config\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.796895 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.802886 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.803704 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.821507 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.823815 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjj5v\" (UniqueName: 
\"kubernetes.io/projected/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-kube-api-access-zjj5v\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.828798 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc60b49b-96fa-40fd-a8e5-40c810f5ef80-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dc60b49b-96fa-40fd-a8e5-40c810f5ef80\") " pod="openstack/openstack-galera-0" Mar 18 10:30:53 crc kubenswrapper[4733]: I0318 10:30:53.906550 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 10:30:54 crc kubenswrapper[4733]: I0318 10:30:54.942179 4733 scope.go:117] "RemoveContainer" containerID="6f50555f9faf96f94c8c33f53803364eb9620cbe1dd5e27e68cba9056a299fa1" Mar 18 10:30:54 crc kubenswrapper[4733]: I0318 10:30:54.999716 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.000826 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.002443 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-67rtq" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.002819 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.002865 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.003074 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.017403 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.130385 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.130450 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjknb\" (UniqueName: \"kubernetes.io/projected/0208d826-df0f-41c8-83a7-821a21b7b85d-kube-api-access-cjknb\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.130481 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.130506 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0208d826-df0f-41c8-83a7-821a21b7b85d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.130551 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.130606 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.130643 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0208d826-df0f-41c8-83a7-821a21b7b85d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.130670 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0208d826-df0f-41c8-83a7-821a21b7b85d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.132482 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.133536 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.135753 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.135920 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dssff" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.136033 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.144455 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232281 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232341 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd66892e-808c-405a-ac8e-366b6ca8b148-config-data\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0" Mar 18 10:30:55 crc 
kubenswrapper[4733]: I0318 10:30:55.232377 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232403 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd66892e-808c-405a-ac8e-366b6ca8b148-memcached-tls-certs\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232426 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0208d826-df0f-41c8-83a7-821a21b7b85d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232448 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0208d826-df0f-41c8-83a7-821a21b7b85d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232502 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvwbt\" (UniqueName: \"kubernetes.io/projected/dd66892e-808c-405a-ac8e-366b6ca8b148-kube-api-access-kvwbt\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232523 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232554 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd66892e-808c-405a-ac8e-366b6ca8b148-combined-ca-bundle\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232577 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dd66892e-808c-405a-ac8e-366b6ca8b148-kolla-config\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232723 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.232785 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjknb\" (UniqueName: \"kubernetes.io/projected/0208d826-df0f-41c8-83a7-821a21b7b85d-kube-api-access-cjknb\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0" Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.233009 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.233051 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0208d826-df0f-41c8-83a7-821a21b7b85d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.233919 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.234100 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.234160 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0208d826-df0f-41c8-83a7-821a21b7b85d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.234704 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0208d826-df0f-41c8-83a7-821a21b7b85d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.238129 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0208d826-df0f-41c8-83a7-821a21b7b85d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.238729 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0208d826-df0f-41c8-83a7-821a21b7b85d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.256525 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjknb\" (UniqueName: \"kubernetes.io/projected/0208d826-df0f-41c8-83a7-821a21b7b85d-kube-api-access-cjknb\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.263868 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0208d826-df0f-41c8-83a7-821a21b7b85d\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.320429 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.335170 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvwbt\" (UniqueName: \"kubernetes.io/projected/dd66892e-808c-405a-ac8e-366b6ca8b148-kube-api-access-kvwbt\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.335235 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd66892e-808c-405a-ac8e-366b6ca8b148-combined-ca-bundle\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.335256 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dd66892e-808c-405a-ac8e-366b6ca8b148-kolla-config\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.335325 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd66892e-808c-405a-ac8e-366b6ca8b148-config-data\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.335706 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd66892e-808c-405a-ac8e-366b6ca8b148-memcached-tls-certs\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.336463 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dd66892e-808c-405a-ac8e-366b6ca8b148-kolla-config\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.336498 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd66892e-808c-405a-ac8e-366b6ca8b148-config-data\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.338222 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd66892e-808c-405a-ac8e-366b6ca8b148-combined-ca-bundle\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.338520 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd66892e-808c-405a-ac8e-366b6ca8b148-memcached-tls-certs\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.352246 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvwbt\" (UniqueName: \"kubernetes.io/projected/dd66892e-808c-405a-ac8e-366b6ca8b148-kube-api-access-kvwbt\") pod \"memcached-0\" (UID: \"dd66892e-808c-405a-ac8e-366b6ca8b148\") " pod="openstack/memcached-0"
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.392583 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" event={"ID":"a56bac49-b398-4b61-9b54-7969acd2dc93","Type":"ContainerStarted","Data":"cf4476f986138503b3408c91ec78e55f73e59536bc0804b03b95667a22e6c6a6"}
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.394341 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" event={"ID":"d6ec9568-99c8-4bee-a97c-46400fcc0e73","Type":"ContainerStarted","Data":"642fd85cd38aa86a0c841253dfb5fda87ce7251b22474e12b3ef33923f92f9b2"}
Mar 18 10:30:55 crc kubenswrapper[4733]: I0318 10:30:55.456704 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 18 10:30:57 crc kubenswrapper[4733]: I0318 10:30:57.368389 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 10:30:57 crc kubenswrapper[4733]: I0318 10:30:57.373403 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 10:30:57 crc kubenswrapper[4733]: I0318 10:30:57.376089 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 10:30:57 crc kubenswrapper[4733]: I0318 10:30:57.377484 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mzkrd"
Mar 18 10:30:57 crc kubenswrapper[4733]: I0318 10:30:57.401853 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77qsg\" (UniqueName: \"kubernetes.io/projected/55f0b230-09f2-4be2-aa1f-76a37f3fe30c-kube-api-access-77qsg\") pod \"kube-state-metrics-0\" (UID: \"55f0b230-09f2-4be2-aa1f-76a37f3fe30c\") " pod="openstack/kube-state-metrics-0"
Mar 18 10:30:57 crc kubenswrapper[4733]: I0318 10:30:57.503317 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77qsg\" (UniqueName: \"kubernetes.io/projected/55f0b230-09f2-4be2-aa1f-76a37f3fe30c-kube-api-access-77qsg\") pod \"kube-state-metrics-0\" (UID: \"55f0b230-09f2-4be2-aa1f-76a37f3fe30c\") " pod="openstack/kube-state-metrics-0"
Mar 18 10:30:57 crc kubenswrapper[4733]: I0318 10:30:57.523214 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77qsg\" (UniqueName: \"kubernetes.io/projected/55f0b230-09f2-4be2-aa1f-76a37f3fe30c-kube-api-access-77qsg\") pod \"kube-state-metrics-0\" (UID: \"55f0b230-09f2-4be2-aa1f-76a37f3fe30c\") " pod="openstack/kube-state-metrics-0"
Mar 18 10:30:57 crc kubenswrapper[4733]: I0318 10:30:57.707762 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.482283 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rh64b"]
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.483502 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.487472 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.487519 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.487769 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-84shn"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.492440 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-ljrgt"]
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.493915 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.500230 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rh64b"]
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.512798 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ljrgt"]
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553253 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-run\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553300 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7j4\" (UniqueName: \"kubernetes.io/projected/d75a8d54-aca8-49cd-9062-6389baaf7a09-kube-api-access-9q7j4\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553424 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-scripts\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553478 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-etc-ovs\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553529 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-lib\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553553 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-combined-ca-bundle\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553575 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdpd5\" (UniqueName: \"kubernetes.io/projected/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-kube-api-access-sdpd5\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553612 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-log-ovn\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553653 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-log\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553741 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-run-ovn\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553763 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d75a8d54-aca8-49cd-9062-6389baaf7a09-scripts\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553788 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-ovn-controller-tls-certs\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.553858 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-run\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654607 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-run\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654660 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7j4\" (UniqueName: \"kubernetes.io/projected/d75a8d54-aca8-49cd-9062-6389baaf7a09-kube-api-access-9q7j4\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654686 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-scripts\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654705 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-etc-ovs\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654730 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-lib\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654748 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdpd5\" (UniqueName: \"kubernetes.io/projected/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-kube-api-access-sdpd5\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654762 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-combined-ca-bundle\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654783 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-log-ovn\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654806 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-log\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654830 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-run-ovn\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654844 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d75a8d54-aca8-49cd-9062-6389baaf7a09-scripts\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654863 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-ovn-controller-tls-certs\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.654894 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-run\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.655221 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-run\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.655239 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-run\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.656166 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-log-ovn\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.656265 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-log\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.656343 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-var-run-ovn\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.657885 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-etc-ovs\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.658923 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d75a8d54-aca8-49cd-9062-6389baaf7a09-var-lib\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.660422 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d75a8d54-aca8-49cd-9062-6389baaf7a09-scripts\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.663986 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-scripts\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.673231 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-ovn-controller-tls-certs\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.673290 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7j4\" (UniqueName: \"kubernetes.io/projected/d75a8d54-aca8-49cd-9062-6389baaf7a09-kube-api-access-9q7j4\") pod \"ovn-controller-ovs-ljrgt\" (UID: \"d75a8d54-aca8-49cd-9062-6389baaf7a09\") " pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.673720 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-combined-ca-bundle\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.677216 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdpd5\" (UniqueName: \"kubernetes.io/projected/e3c842d3-b3dd-4cf2-9df0-16cea4061bc5-kube-api-access-sdpd5\") pod \"ovn-controller-rh64b\" (UID: \"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5\") " pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.837847 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rh64b"
Mar 18 10:31:00 crc kubenswrapper[4733]: I0318 10:31:00.849038 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ljrgt"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.391927 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.393284 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.396222 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.396730 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.396777 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rnxnj"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.396870 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.396933 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.399270 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.569919 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95hf9\" (UniqueName: \"kubernetes.io/projected/a8c27598-870d-4de0-a986-47042d7d6f4c-kube-api-access-95hf9\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.570005 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8c27598-870d-4de0-a986-47042d7d6f4c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.570080 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.570166 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.570220 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.570246 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c27598-870d-4de0-a986-47042d7d6f4c-config\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.570391 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8c27598-870d-4de0-a986-47042d7d6f4c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.570457 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.672578 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8c27598-870d-4de0-a986-47042d7d6f4c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.672648 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.672706 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95hf9\" (UniqueName: \"kubernetes.io/projected/a8c27598-870d-4de0-a986-47042d7d6f4c-kube-api-access-95hf9\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.672766 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8c27598-870d-4de0-a986-47042d7d6f4c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.672794 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.672832 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.672851 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.672881 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c27598-870d-4de0-a986-47042d7d6f4c-config\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.673347 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.678650 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8c27598-870d-4de0-a986-47042d7d6f4c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.680452 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8c27598-870d-4de0-a986-47042d7d6f4c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.680953 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.681631 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c27598-870d-4de0-a986-47042d7d6f4c-config\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.688878 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.690574 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95hf9\" (UniqueName: \"kubernetes.io/projected/a8c27598-870d-4de0-a986-47042d7d6f4c-kube-api-access-95hf9\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.695122 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c27598-870d-4de0-a986-47042d7d6f4c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.698126 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a8c27598-870d-4de0-a986-47042d7d6f4c\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:01 crc kubenswrapper[4733]: I0318 10:31:01.720732 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 18 10:31:03 crc kubenswrapper[4733]: E0318 10:31:03.867979 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 18 10:31:03 crc kubenswrapper[4733]: E0318 10:31:03.868483 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mtr2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-7hxp6_openstack(725b76ca-c6aa-47f4-b75b-7ba4cd999979): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 10:31:03 crc kubenswrapper[4733]: E0318 10:31:03.869696 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\""
pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" podUID="725b76ca-c6aa-47f4-b75b-7ba4cd999979" Mar 18 10:31:03 crc kubenswrapper[4733]: E0318 10:31:03.915455 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 10:31:03 crc kubenswrapper[4733]: E0318 10:31:03.915611 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dc6zn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-jhs6c_openstack(681a4fb9-f5dc-4b7d-aad7-45d15f11de1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 10:31:03 crc kubenswrapper[4733]: E0318 10:31:03.916770 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c" podUID="681a4fb9-f5dc-4b7d-aad7-45d15f11de1c" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.480869 4733 generic.go:334] "Generic (PLEG): container finished" podID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" containerID="f1bfaec48682e00a29092241786f91d62461cdabac1600bdef26a808c7697bdd" exitCode=0 Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.482241 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" event={"ID":"d6ec9568-99c8-4bee-a97c-46400fcc0e73","Type":"ContainerDied","Data":"f1bfaec48682e00a29092241786f91d62461cdabac1600bdef26a808c7697bdd"} Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.485740 4733 generic.go:334] "Generic (PLEG): container finished" podID="a56bac49-b398-4b61-9b54-7969acd2dc93" containerID="0d7f9ebe26354b0fdcafbf9243319e234596728fe76f62846f8d6f2de9c01686" exitCode=0 Mar 18 10:31:04 crc kubenswrapper[4733]: 
I0318 10:31:04.487035 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" event={"ID":"a56bac49-b398-4b61-9b54-7969acd2dc93","Type":"ContainerDied","Data":"0d7f9ebe26354b0fdcafbf9243319e234596728fe76f62846f8d6f2de9c01686"} Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.570056 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.571633 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.576294 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.576509 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.576646 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-726jj" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.578596 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.593503 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.597649 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.605567 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.623674 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 10:31:04 crc kubenswrapper[4733]: W0318 10:31:04.666906 4733 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc60b49b_96fa_40fd_a8e5_40c810f5ef80.slice/crio-1cc1d503cfe0c3f0293efb60f8e37ebd8029bc5f9e2559b027fd01da9fcfc135 WatchSource:0}: Error finding container 1cc1d503cfe0c3f0293efb60f8e37ebd8029bc5f9e2559b027fd01da9fcfc135: Status 404 returned error can't find the container with id 1cc1d503cfe0c3f0293efb60f8e37ebd8029bc5f9e2559b027fd01da9fcfc135 Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.720820 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ljrgt"] Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.760656 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0868210e-9d93-4f63-b425-7db21f13cd90-config\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.760705 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.760781 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0868210e-9d93-4f63-b425-7db21f13cd90-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.760826 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.760858 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0868210e-9d93-4f63-b425-7db21f13cd90-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.760904 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.760948 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-842ht\" (UniqueName: \"kubernetes.io/projected/0868210e-9d93-4f63-b425-7db21f13cd90-kube-api-access-842ht\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.760971 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.862636 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-842ht\" (UniqueName: \"kubernetes.io/projected/0868210e-9d93-4f63-b425-7db21f13cd90-kube-api-access-842ht\") pod \"ovsdbserver-sb-0\" (UID: 
\"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.863090 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.863130 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0868210e-9d93-4f63-b425-7db21f13cd90-config\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.863164 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.863262 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0868210e-9d93-4f63-b425-7db21f13cd90-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.863316 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.863346 4733 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0868210e-9d93-4f63-b425-7db21f13cd90-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.863399 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.863901 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.864017 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0868210e-9d93-4f63-b425-7db21f13cd90-config\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.864048 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0868210e-9d93-4f63-b425-7db21f13cd90-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.866791 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0868210e-9d93-4f63-b425-7db21f13cd90-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") 
" pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.869219 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.869253 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.873958 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0868210e-9d93-4f63-b425-7db21f13cd90-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.882042 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.883322 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-842ht\" (UniqueName: \"kubernetes.io/projected/0868210e-9d93-4f63-b425-7db21f13cd90-kube-api-access-842ht\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.893463 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0868210e-9d93-4f63-b425-7db21f13cd90\") " pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.893842 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.903698 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:04 crc kubenswrapper[4733]: I0318 10:31:04.977593 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 10:31:05 crc kubenswrapper[4733]: W0318 10:31:05.005947 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0208d826_df0f_41c8_83a7_821a21b7b85d.slice/crio-c832fa08ef6fe02ea7fd8206a78bd9fcd5b47ee8734823fb799cd2309fa0a789 WatchSource:0}: Error finding container c832fa08ef6fe02ea7fd8206a78bd9fcd5b47ee8734823fb799cd2309fa0a789: Status 404 returned error can't find the container with id c832fa08ef6fe02ea7fd8206a78bd9fcd5b47ee8734823fb799cd2309fa0a789 Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.019153 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rh64b"] Mar 18 10:31:05 crc kubenswrapper[4733]: W0318 10:31:05.030448 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c842d3_b3dd_4cf2_9df0_16cea4061bc5.slice/crio-b6ddef781f663dec8631c0b8e7931ac50c253872640374c5f5e682ce10a5bdce WatchSource:0}: Error finding container b6ddef781f663dec8631c0b8e7931ac50c253872640374c5f5e682ce10a5bdce: Status 404 returned error can't find the container with id b6ddef781f663dec8631c0b8e7931ac50c253872640374c5f5e682ce10a5bdce Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.043587 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 10:31:05 crc kubenswrapper[4733]: W0318 10:31:05.060749 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a4e3e2_bd4d_4f8d_97bc_51267378ab03.slice/crio-b52c27ec7e9f93d315a6dda8b5e47a3247a65d8221d7f759cf1532f592f883b1 WatchSource:0}: Error finding container 
b52c27ec7e9f93d315a6dda8b5e47a3247a65d8221d7f759cf1532f592f883b1: Status 404 returned error can't find the container with id b52c27ec7e9f93d315a6dda8b5e47a3247a65d8221d7f759cf1532f592f883b1 Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.064559 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.067860 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-dns-svc\") pod \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.067995 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc6zn\" (UniqueName: \"kubernetes.io/projected/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-kube-api-access-dc6zn\") pod \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.068034 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725b76ca-c6aa-47f4-b75b-7ba4cd999979-config\") pod \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\" (UID: \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\") " Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.068118 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-config\") pod \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\" (UID: \"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c\") " Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.068154 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtr2d\" (UniqueName: 
\"kubernetes.io/projected/725b76ca-c6aa-47f4-b75b-7ba4cd999979-kube-api-access-mtr2d\") pod \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\" (UID: \"725b76ca-c6aa-47f4-b75b-7ba4cd999979\") " Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.069211 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "681a4fb9-f5dc-4b7d-aad7-45d15f11de1c" (UID: "681a4fb9-f5dc-4b7d-aad7-45d15f11de1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.069597 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725b76ca-c6aa-47f4-b75b-7ba4cd999979-config" (OuterVolumeSpecName: "config") pod "725b76ca-c6aa-47f4-b75b-7ba4cd999979" (UID: "725b76ca-c6aa-47f4-b75b-7ba4cd999979"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.069650 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-config" (OuterVolumeSpecName: "config") pod "681a4fb9-f5dc-4b7d-aad7-45d15f11de1c" (UID: "681a4fb9-f5dc-4b7d-aad7-45d15f11de1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.074154 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725b76ca-c6aa-47f4-b75b-7ba4cd999979-kube-api-access-mtr2d" (OuterVolumeSpecName: "kube-api-access-mtr2d") pod "725b76ca-c6aa-47f4-b75b-7ba4cd999979" (UID: "725b76ca-c6aa-47f4-b75b-7ba4cd999979"). InnerVolumeSpecName "kube-api-access-mtr2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.075044 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-kube-api-access-dc6zn" (OuterVolumeSpecName: "kube-api-access-dc6zn") pod "681a4fb9-f5dc-4b7d-aad7-45d15f11de1c" (UID: "681a4fb9-f5dc-4b7d-aad7-45d15f11de1c"). InnerVolumeSpecName "kube-api-access-dc6zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.171309 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.171693 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtr2d\" (UniqueName: \"kubernetes.io/projected/725b76ca-c6aa-47f4-b75b-7ba4cd999979-kube-api-access-mtr2d\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.171711 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.171724 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc6zn\" (UniqueName: \"kubernetes.io/projected/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c-kube-api-access-dc6zn\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.171763 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725b76ca-c6aa-47f4-b75b-7ba4cd999979-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.429284 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 
10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.495518 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ljrgt" event={"ID":"d75a8d54-aca8-49cd-9062-6389baaf7a09","Type":"ContainerStarted","Data":"f6d52325fe24512c4cbe1a8ba8f41b25ad326a0a8ebc7201acee3673f1955213"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.498212 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" event={"ID":"a56bac49-b398-4b61-9b54-7969acd2dc93","Type":"ContainerStarted","Data":"9ab9c30c68bdbce8904477ffce48bb66e4703ed2c3ced83026789c4904ce7735"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.499126 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.500440 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rh64b" event={"ID":"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5","Type":"ContainerStarted","Data":"b6ddef781f663dec8631c0b8e7931ac50c253872640374c5f5e682ce10a5bdce"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.502412 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"5e84fda0307415591d2c9d4daad4d37b5e33c659c7910fb7b6abb0d132644f7d"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.503335 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"55f0b230-09f2-4be2-aa1f-76a37f3fe30c","Type":"ContainerStarted","Data":"668c3f35174c4e84be42265f1c88bf5ad344602872ac5d7aa3490f7a7785d7a2"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.504773 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c" 
event={"ID":"681a4fb9-f5dc-4b7d-aad7-45d15f11de1c","Type":"ContainerDied","Data":"143eafdd483ed64fb00fabfeb37bfb5824c7d4e108597ececea0f80ea29d068b"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.504827 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jhs6c" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.506925 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"b52c27ec7e9f93d315a6dda8b5e47a3247a65d8221d7f759cf1532f592f883b1"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.508302 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0208d826-df0f-41c8-83a7-821a21b7b85d","Type":"ContainerStarted","Data":"c832fa08ef6fe02ea7fd8206a78bd9fcd5b47ee8734823fb799cd2309fa0a789"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.510341 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dc60b49b-96fa-40fd-a8e5-40c810f5ef80","Type":"ContainerStarted","Data":"1cc1d503cfe0c3f0293efb60f8e37ebd8029bc5f9e2559b027fd01da9fcfc135"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.517344 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" podStartSLOduration=5.193686977 podStartE2EDuration="14.517325795s" podCreationTimestamp="2026-03-18 10:30:51 +0000 UTC" firstStartedPulling="2026-03-18 10:30:54.697752813 +0000 UTC m=+1094.189487138" lastFinishedPulling="2026-03-18 10:31:04.021391631 +0000 UTC m=+1103.513125956" observedRunningTime="2026-03-18 10:31:05.513399284 +0000 UTC m=+1105.005133619" watchObservedRunningTime="2026-03-18 10:31:05.517325795 +0000 UTC m=+1105.009060120" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.522326 4733 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" event={"ID":"725b76ca-c6aa-47f4-b75b-7ba4cd999979","Type":"ContainerDied","Data":"338afcfd61f57729b6823a0742441ae38ca9d73a3a0cb99ea518e0556be8e8f6"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.522390 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7hxp6" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.526954 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"dd66892e-808c-405a-ac8e-366b6ca8b148","Type":"ContainerStarted","Data":"815f9c448060d19c188316891f4b7a7913171ca8b6609a7cca214bade89395e1"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.532261 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" event={"ID":"d6ec9568-99c8-4bee-a97c-46400fcc0e73","Type":"ContainerStarted","Data":"aef65315ca9ffdfc8f038573ba4d7c7dad7e497bb3b3225251bd4117319d3146"} Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.537148 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.649501 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jhs6c"] Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.668276 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jhs6c"] Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.684418 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" podStartSLOduration=5.321941026 podStartE2EDuration="14.684397223s" podCreationTimestamp="2026-03-18 10:30:51 +0000 UTC" firstStartedPulling="2026-03-18 10:30:54.699721529 +0000 UTC m=+1094.191455854" lastFinishedPulling="2026-03-18 10:31:04.062177716 +0000 UTC m=+1103.553912051" 
observedRunningTime="2026-03-18 10:31:05.551003458 +0000 UTC m=+1105.042737793" watchObservedRunningTime="2026-03-18 10:31:05.684397223 +0000 UTC m=+1105.176131548" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.721787 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7hxp6"] Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.732931 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7hxp6"] Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.751383 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.935608 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6trms"] Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.937279 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.939383 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 10:31:05 crc kubenswrapper[4733]: W0318 10:31:05.939577 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8c27598_870d_4de0_a986_47042d7d6f4c.slice/crio-b06a5b369913b1df1835f783d2fda94d69d9748e0d8b40519583ecfe90984a75 WatchSource:0}: Error finding container b06a5b369913b1df1835f783d2fda94d69d9748e0d8b40519583ecfe90984a75: Status 404 returned error can't find the container with id b06a5b369913b1df1835f783d2fda94d69d9748e0d8b40519583ecfe90984a75 Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.955900 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6trms"] Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.998059 4733 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-config\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.998107 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.998232 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-ovn-rundir\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.998257 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-combined-ca-bundle\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.998294 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-ovs-rundir\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:05 crc kubenswrapper[4733]: I0318 10:31:05.998397 
4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h7tb\" (UniqueName: \"kubernetes.io/projected/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-kube-api-access-2h7tb\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.073848 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xh24t"] Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.126785 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-ovn-rundir\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.126828 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-combined-ca-bundle\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.126847 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-ovs-rundir\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.126916 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h7tb\" (UniqueName: \"kubernetes.io/projected/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-kube-api-access-2h7tb\") pod \"ovn-controller-metrics-6trms\" (UID: 
\"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.126940 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-config\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.126961 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.130209 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-ovs-rundir\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.130320 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-ovn-rundir\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.131131 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-config\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc 
kubenswrapper[4733]: I0318 10:31:06.137780 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.141272 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-combined-ca-bundle\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.165518 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-z55fc"] Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.166683 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.175455 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.194743 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h7tb\" (UniqueName: \"kubernetes.io/projected/e7849feb-5f1b-4b67-a3f7-8a419ebda0bd-kube-api-access-2h7tb\") pod \"ovn-controller-metrics-6trms\" (UID: \"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd\") " pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.203700 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-z55fc"] Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.227961 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.228003 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.228048 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-config\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 
10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.228083 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/19376969-b236-4b21-b57f-3833a3c0c7b4-kube-api-access-nrzpq\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.307321 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tljb4"] Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.308820 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6trms" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.333231 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.333285 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.333333 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-config\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.333367 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/19376969-b236-4b21-b57f-3833a3c0c7b4-kube-api-access-nrzpq\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.334598 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-config\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.334771 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.337585 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-rrvg6"] Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.349091 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.349977 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.351593 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.355918 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rrvg6"] Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.369799 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/19376969-b236-4b21-b57f-3833a3c0c7b4-kube-api-access-nrzpq\") pod \"dnsmasq-dns-5bf47b49b7-z55fc\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.436074 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-dns-svc\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.436153 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-config\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: 
I0318 10:31:06.436248 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5w85\" (UniqueName: \"kubernetes.io/projected/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-kube-api-access-g5w85\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.436274 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.436375 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.538280 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-dns-svc\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.538352 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-config\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 
10:31:06.538397 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5w85\" (UniqueName: \"kubernetes.io/projected/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-kube-api-access-g5w85\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.538422 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.538496 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.539945 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-dns-svc\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.540265 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.541580 4733 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.543112 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-config\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.547475 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0868210e-9d93-4f63-b425-7db21f13cd90","Type":"ContainerStarted","Data":"ee032d9694b4a39ad832b4921248ce94a5c6074e3c7ce3055076e87f6505375c"} Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.550686 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a8c27598-870d-4de0-a986-47042d7d6f4c","Type":"ContainerStarted","Data":"b06a5b369913b1df1835f783d2fda94d69d9748e0d8b40519583ecfe90984a75"} Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.555118 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5w85\" (UniqueName: \"kubernetes.io/projected/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-kube-api-access-g5w85\") pod \"dnsmasq-dns-8554648995-rrvg6\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.597212 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:06 crc kubenswrapper[4733]: I0318 10:31:06.668153 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:07 crc kubenswrapper[4733]: I0318 10:31:07.187504 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="681a4fb9-f5dc-4b7d-aad7-45d15f11de1c" path="/var/lib/kubelet/pods/681a4fb9-f5dc-4b7d-aad7-45d15f11de1c/volumes" Mar 18 10:31:07 crc kubenswrapper[4733]: I0318 10:31:07.188073 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725b76ca-c6aa-47f4-b75b-7ba4cd999979" path="/var/lib/kubelet/pods/725b76ca-c6aa-47f4-b75b-7ba4cd999979/volumes" Mar 18 10:31:07 crc kubenswrapper[4733]: I0318 10:31:07.312680 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6trms"] Mar 18 10:31:07 crc kubenswrapper[4733]: I0318 10:31:07.561118 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" podUID="a56bac49-b398-4b61-9b54-7969acd2dc93" containerName="dnsmasq-dns" containerID="cri-o://9ab9c30c68bdbce8904477ffce48bb66e4703ed2c3ced83026789c4904ce7735" gracePeriod=10 Mar 18 10:31:07 crc kubenswrapper[4733]: I0318 10:31:07.561558 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" podUID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" containerName="dnsmasq-dns" containerID="cri-o://aef65315ca9ffdfc8f038573ba4d7c7dad7e497bb3b3225251bd4117319d3146" gracePeriod=10 Mar 18 10:31:08 crc kubenswrapper[4733]: I0318 10:31:08.569393 4733 generic.go:334] "Generic (PLEG): container finished" podID="a56bac49-b398-4b61-9b54-7969acd2dc93" containerID="9ab9c30c68bdbce8904477ffce48bb66e4703ed2c3ced83026789c4904ce7735" exitCode=0 Mar 18 10:31:08 crc kubenswrapper[4733]: I0318 10:31:08.569471 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" 
event={"ID":"a56bac49-b398-4b61-9b54-7969acd2dc93","Type":"ContainerDied","Data":"9ab9c30c68bdbce8904477ffce48bb66e4703ed2c3ced83026789c4904ce7735"} Mar 18 10:31:08 crc kubenswrapper[4733]: I0318 10:31:08.571976 4733 generic.go:334] "Generic (PLEG): container finished" podID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" containerID="aef65315ca9ffdfc8f038573ba4d7c7dad7e497bb3b3225251bd4117319d3146" exitCode=0 Mar 18 10:31:08 crc kubenswrapper[4733]: I0318 10:31:08.572053 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" event={"ID":"d6ec9568-99c8-4bee-a97c-46400fcc0e73","Type":"ContainerDied","Data":"aef65315ca9ffdfc8f038573ba4d7c7dad7e497bb3b3225251bd4117319d3146"} Mar 18 10:31:08 crc kubenswrapper[4733]: I0318 10:31:08.573021 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6trms" event={"ID":"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd","Type":"ContainerStarted","Data":"e7e81fa094228e3b5ea6026dec0e1023cf195bd21cad69e71c8bbeda57a23101"} Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.101360 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.105454 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.248303 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-dns-svc\") pod \"a56bac49-b398-4b61-9b54-7969acd2dc93\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.248348 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-dns-svc\") pod \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.248405 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-config\") pod \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.248431 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tggzq\" (UniqueName: \"kubernetes.io/projected/d6ec9568-99c8-4bee-a97c-46400fcc0e73-kube-api-access-tggzq\") pod \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\" (UID: \"d6ec9568-99c8-4bee-a97c-46400fcc0e73\") " Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.248447 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-config\") pod \"a56bac49-b398-4b61-9b54-7969acd2dc93\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.248469 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mfbr\" (UniqueName: 
\"kubernetes.io/projected/a56bac49-b398-4b61-9b54-7969acd2dc93-kube-api-access-5mfbr\") pod \"a56bac49-b398-4b61-9b54-7969acd2dc93\" (UID: \"a56bac49-b398-4b61-9b54-7969acd2dc93\") " Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.261552 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56bac49-b398-4b61-9b54-7969acd2dc93-kube-api-access-5mfbr" (OuterVolumeSpecName: "kube-api-access-5mfbr") pod "a56bac49-b398-4b61-9b54-7969acd2dc93" (UID: "a56bac49-b398-4b61-9b54-7969acd2dc93"). InnerVolumeSpecName "kube-api-access-5mfbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.264272 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ec9568-99c8-4bee-a97c-46400fcc0e73-kube-api-access-tggzq" (OuterVolumeSpecName: "kube-api-access-tggzq") pod "d6ec9568-99c8-4bee-a97c-46400fcc0e73" (UID: "d6ec9568-99c8-4bee-a97c-46400fcc0e73"). InnerVolumeSpecName "kube-api-access-tggzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.304072 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-config" (OuterVolumeSpecName: "config") pod "d6ec9568-99c8-4bee-a97c-46400fcc0e73" (UID: "d6ec9568-99c8-4bee-a97c-46400fcc0e73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.309069 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-config" (OuterVolumeSpecName: "config") pod "a56bac49-b398-4b61-9b54-7969acd2dc93" (UID: "a56bac49-b398-4b61-9b54-7969acd2dc93"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.322859 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6ec9568-99c8-4bee-a97c-46400fcc0e73" (UID: "d6ec9568-99c8-4bee-a97c-46400fcc0e73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.323884 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a56bac49-b398-4b61-9b54-7969acd2dc93" (UID: "a56bac49-b398-4b61-9b54-7969acd2dc93"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.352348 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mfbr\" (UniqueName: \"kubernetes.io/projected/a56bac49-b398-4b61-9b54-7969acd2dc93-kube-api-access-5mfbr\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.352380 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.352390 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.352398 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ec9568-99c8-4bee-a97c-46400fcc0e73-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 
10:31:11.352407 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tggzq\" (UniqueName: \"kubernetes.io/projected/d6ec9568-99c8-4bee-a97c-46400fcc0e73-kube-api-access-tggzq\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.352415 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56bac49-b398-4b61-9b54-7969acd2dc93-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.598871 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" event={"ID":"a56bac49-b398-4b61-9b54-7969acd2dc93","Type":"ContainerDied","Data":"cf4476f986138503b3408c91ec78e55f73e59536bc0804b03b95667a22e6c6a6"} Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.598939 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xh24t" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.598946 4733 scope.go:117] "RemoveContainer" containerID="9ab9c30c68bdbce8904477ffce48bb66e4703ed2c3ced83026789c4904ce7735" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.602572 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" event={"ID":"d6ec9568-99c8-4bee-a97c-46400fcc0e73","Type":"ContainerDied","Data":"642fd85cd38aa86a0c841253dfb5fda87ce7251b22474e12b3ef33923f92f9b2"} Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.602630 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tljb4" Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.640874 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xh24t"] Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.648368 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xh24t"] Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.656593 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tljb4"] Mar 18 10:31:11 crc kubenswrapper[4733]: I0318 10:31:11.664637 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tljb4"] Mar 18 10:31:13 crc kubenswrapper[4733]: I0318 10:31:13.188936 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a56bac49-b398-4b61-9b54-7969acd2dc93" path="/var/lib/kubelet/pods/a56bac49-b398-4b61-9b54-7969acd2dc93/volumes" Mar 18 10:31:13 crc kubenswrapper[4733]: I0318 10:31:13.190461 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" path="/var/lib/kubelet/pods/d6ec9568-99c8-4bee-a97c-46400fcc0e73/volumes" Mar 18 10:31:13 crc kubenswrapper[4733]: I0318 10:31:13.802447 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-z55fc"] Mar 18 10:31:15 crc kubenswrapper[4733]: I0318 10:31:15.374818 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rrvg6"] Mar 18 10:31:20 crc kubenswrapper[4733]: E0318 10:31:20.261966 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 18 10:31:20 crc kubenswrapper[4733]: E0318 10:31:20.262630 4733 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 18 10:31:20 crc kubenswrapper[4733]: E0318 10:31:20.262819 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-77qsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(55f0b230-09f2-4be2-aa1f-76a37f3fe30c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 10:31:20 crc kubenswrapper[4733]: E0318 10:31:20.264302 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="55f0b230-09f2-4be2-aa1f-76a37f3fe30c" Mar 18 10:31:20 crc kubenswrapper[4733]: I0318 10:31:20.680533 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" event={"ID":"19376969-b236-4b21-b57f-3833a3c0c7b4","Type":"ContainerStarted","Data":"d334497d76d55a41bc85eeb60a10080fe3c2d76a582435f142bf759662b2098b"} Mar 18 10:31:20 crc kubenswrapper[4733]: E0318 10:31:20.682007 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="55f0b230-09f2-4be2-aa1f-76a37f3fe30c" Mar 18 10:31:20 crc kubenswrapper[4733]: I0318 10:31:20.861755 4733 scope.go:117] "RemoveContainer" containerID="0d7f9ebe26354b0fdcafbf9243319e234596728fe76f62846f8d6f2de9c01686" Mar 18 10:31:20 crc kubenswrapper[4733]: W0318 10:31:20.867388 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15df79ef_9d7a_4310_ba27_bdf8cb200f0f.slice/crio-1ad8ff69c6adc3dabc943b2d0fb235bc6e4c5c162e015c87081a85eb5257721c WatchSource:0}: Error finding container 1ad8ff69c6adc3dabc943b2d0fb235bc6e4c5c162e015c87081a85eb5257721c: Status 404 returned error can't find the container with id 1ad8ff69c6adc3dabc943b2d0fb235bc6e4c5c162e015c87081a85eb5257721c Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.079336 4733 scope.go:117] "RemoveContainer" containerID="aef65315ca9ffdfc8f038573ba4d7c7dad7e497bb3b3225251bd4117319d3146" Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.280502 4733 scope.go:117] "RemoveContainer" containerID="f1bfaec48682e00a29092241786f91d62461cdabac1600bdef26a808c7697bdd" Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.697483 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6trms" event={"ID":"e7849feb-5f1b-4b67-a3f7-8a419ebda0bd","Type":"ContainerStarted","Data":"89f64d45993ca9a528d13be0f8de5723e62397975bf7abe7b8b32795b7459d68"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.699784 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ljrgt" event={"ID":"d75a8d54-aca8-49cd-9062-6389baaf7a09","Type":"ContainerStarted","Data":"c9bad20081449289687cd76260cb1b4999faa861437220370b9811519c736d36"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.703026 4733 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovn-controller-rh64b" event={"ID":"e3c842d3-b3dd-4cf2-9df0-16cea4061bc5","Type":"ContainerStarted","Data":"3b5a58b07f40c47d3810a2f753da5adaefb77778dab24aa461dc75931cdadf89"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.703571 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rh64b" Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.705115 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a8c27598-870d-4de0-a986-47042d7d6f4c","Type":"ContainerStarted","Data":"b65a0da116ca68b29800d866ab03f9d76e087c6073b691718aa6b8e2d620e06c"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.708817 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dc60b49b-96fa-40fd-a8e5-40c810f5ef80","Type":"ContainerStarted","Data":"1996bad471830d839e4c51afbee66ae68a9d0380538f2fb4187d0e6cb1e23827"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.713381 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6trms" podStartSLOduration=3.645966103 podStartE2EDuration="16.713368595s" podCreationTimestamp="2026-03-18 10:31:05 +0000 UTC" firstStartedPulling="2026-03-18 10:31:08.072163804 +0000 UTC m=+1107.563898129" lastFinishedPulling="2026-03-18 10:31:21.139566286 +0000 UTC m=+1120.631300621" observedRunningTime="2026-03-18 10:31:21.711143152 +0000 UTC m=+1121.202877467" watchObservedRunningTime="2026-03-18 10:31:21.713368595 +0000 UTC m=+1121.205102920" Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.721883 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"dd66892e-808c-405a-ac8e-366b6ca8b148","Type":"ContainerStarted","Data":"123a82e05833cb782336a6b1c9d7f242be6afa92f95d5c9bcf898f5a656a5076"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.722603 4733 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/memcached-0" Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.725764 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" event={"ID":"19376969-b236-4b21-b57f-3833a3c0c7b4","Type":"ContainerStarted","Data":"1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.727071 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rrvg6" event={"ID":"15df79ef-9d7a-4310-ba27-bdf8cb200f0f","Type":"ContainerStarted","Data":"e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.727092 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rrvg6" event={"ID":"15df79ef-9d7a-4310-ba27-bdf8cb200f0f","Type":"ContainerStarted","Data":"1ad8ff69c6adc3dabc943b2d0fb235bc6e4c5c162e015c87081a85eb5257721c"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.731684 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0868210e-9d93-4f63-b425-7db21f13cd90","Type":"ContainerStarted","Data":"32d5c065f6f149974bcb872025471f86b31a160e54f0c13567cc06fac5be118b"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.734549 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0208d826-df0f-41c8-83a7-821a21b7b85d","Type":"ContainerStarted","Data":"3874dea0990730a5c77d703a00ba1f3a9abfb0de15e4fe3acdd1a096b1bf4ecc"} Mar 18 10:31:21 crc kubenswrapper[4733]: I0318 10:31:21.793957 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rh64b" podStartSLOduration=5.761864006 podStartE2EDuration="21.793937545s" podCreationTimestamp="2026-03-18 10:31:00 +0000 UTC" firstStartedPulling="2026-03-18 10:31:05.032588958 +0000 UTC m=+1104.524323273" 
lastFinishedPulling="2026-03-18 10:31:21.064662487 +0000 UTC m=+1120.556396812" observedRunningTime="2026-03-18 10:31:21.785051443 +0000 UTC m=+1121.276785788" watchObservedRunningTime="2026-03-18 10:31:21.793937545 +0000 UTC m=+1121.285671870" Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.747270 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"1b521608cd076add0dc6ea82ec6fd5b69318ec8068de497c0a6615c97830553d"} Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.750496 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a8c27598-870d-4de0-a986-47042d7d6f4c","Type":"ContainerStarted","Data":"07644ad53f49ea3b6fa47644d62cc985a5a2fc0647cac5021225d9225b5c3e56"} Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.753554 4733 generic.go:334] "Generic (PLEG): container finished" podID="19376969-b236-4b21-b57f-3833a3c0c7b4" containerID="1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10" exitCode=0 Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.753632 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" event={"ID":"19376969-b236-4b21-b57f-3833a3c0c7b4","Type":"ContainerDied","Data":"1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10"} Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.753713 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" event={"ID":"19376969-b236-4b21-b57f-3833a3c0c7b4","Type":"ContainerStarted","Data":"58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f"} Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.753805 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.762431 4733 generic.go:334] 
"Generic (PLEG): container finished" podID="d75a8d54-aca8-49cd-9062-6389baaf7a09" containerID="c9bad20081449289687cd76260cb1b4999faa861437220370b9811519c736d36" exitCode=0 Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.762577 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ljrgt" event={"ID":"d75a8d54-aca8-49cd-9062-6389baaf7a09","Type":"ContainerDied","Data":"c9bad20081449289687cd76260cb1b4999faa861437220370b9811519c736d36"} Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.770774 4733 generic.go:334] "Generic (PLEG): container finished" podID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" containerID="e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937" exitCode=0 Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.770916 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rrvg6" event={"ID":"15df79ef-9d7a-4310-ba27-bdf8cb200f0f","Type":"ContainerDied","Data":"e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937"} Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.778664 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0868210e-9d93-4f63-b425-7db21f13cd90","Type":"ContainerStarted","Data":"bf511622c4bebbf135ee7da9030a30c0e739ce4aa3cf29a5e9df7905992d8113"} Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.788459 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"0fb5e774f72bc7530e7861681639d72697b8c0245883531528195b98bc45ea93"} Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.808529 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.435525729 podStartE2EDuration="27.808512206s" podCreationTimestamp="2026-03-18 10:30:55 +0000 UTC" firstStartedPulling="2026-03-18 10:31:04.663557394 
+0000 UTC m=+1104.155291719" lastFinishedPulling="2026-03-18 10:31:13.036543871 +0000 UTC m=+1112.528278196" observedRunningTime="2026-03-18 10:31:21.874933677 +0000 UTC m=+1121.366668002" watchObservedRunningTime="2026-03-18 10:31:22.808512206 +0000 UTC m=+1122.300246551" Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.837902 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.675424847 podStartE2EDuration="19.837878907s" podCreationTimestamp="2026-03-18 10:31:03 +0000 UTC" firstStartedPulling="2026-03-18 10:31:05.941089007 +0000 UTC m=+1105.432823332" lastFinishedPulling="2026-03-18 10:31:21.103543037 +0000 UTC m=+1120.595277392" observedRunningTime="2026-03-18 10:31:22.831676162 +0000 UTC m=+1122.323410527" watchObservedRunningTime="2026-03-18 10:31:22.837878907 +0000 UTC m=+1122.329613232" Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.905171 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:22 crc kubenswrapper[4733]: I0318 10:31:22.944978 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.832178363 podStartE2EDuration="22.944959257s" podCreationTimestamp="2026-03-18 10:31:00 +0000 UTC" firstStartedPulling="2026-03-18 10:31:05.950124223 +0000 UTC m=+1105.441858548" lastFinishedPulling="2026-03-18 10:31:21.062905117 +0000 UTC m=+1120.554639442" observedRunningTime="2026-03-18 10:31:22.940558013 +0000 UTC m=+1122.432292338" watchObservedRunningTime="2026-03-18 10:31:22.944959257 +0000 UTC m=+1122.436693582" Mar 18 10:31:23 crc kubenswrapper[4733]: I0318 10:31:22.965461 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" podStartSLOduration=16.965438267 podStartE2EDuration="16.965438267s" podCreationTimestamp="2026-03-18 10:31:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:31:22.960300182 +0000 UTC m=+1122.452034517" watchObservedRunningTime="2026-03-18 10:31:22.965438267 +0000 UTC m=+1122.457172592" Mar 18 10:31:24 crc kubenswrapper[4733]: I0318 10:31:24.905324 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:24 crc kubenswrapper[4733]: I0318 10:31:24.972963 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rrvg6" event={"ID":"15df79ef-9d7a-4310-ba27-bdf8cb200f0f","Type":"ContainerStarted","Data":"e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b"} Mar 18 10:31:24 crc kubenswrapper[4733]: I0318 10:31:24.974129 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:24 crc kubenswrapper[4733]: I0318 10:31:24.978676 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ljrgt" event={"ID":"d75a8d54-aca8-49cd-9062-6389baaf7a09","Type":"ContainerStarted","Data":"977df54e9e21058ae200793f039a61cc9e8ef7c488904477a179841b9c705157"} Mar 18 10:31:24 crc kubenswrapper[4733]: I0318 10:31:24.978955 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ljrgt" Mar 18 10:31:24 crc kubenswrapper[4733]: I0318 10:31:24.979090 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ljrgt" Mar 18 10:31:24 crc kubenswrapper[4733]: I0318 10:31:24.979174 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ljrgt" event={"ID":"d75a8d54-aca8-49cd-9062-6389baaf7a09","Type":"ContainerStarted","Data":"0f25d0fd17f75d63c2160078393331e1696cc8d7b2ceed1f3ff8cb5a51395ea4"} Mar 18 10:31:25 crc kubenswrapper[4733]: I0318 10:31:25.003129 4733 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-rrvg6" podStartSLOduration=19.003108131 podStartE2EDuration="19.003108131s" podCreationTimestamp="2026-03-18 10:31:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:31:24.993691964 +0000 UTC m=+1124.485426289" watchObservedRunningTime="2026-03-18 10:31:25.003108131 +0000 UTC m=+1124.494842466" Mar 18 10:31:25 crc kubenswrapper[4733]: I0318 10:31:25.028633 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-ljrgt" podStartSLOduration=15.157665566 podStartE2EDuration="25.028614703s" podCreationTimestamp="2026-03-18 10:31:00 +0000 UTC" firstStartedPulling="2026-03-18 10:31:04.728999556 +0000 UTC m=+1104.220733881" lastFinishedPulling="2026-03-18 10:31:14.599948693 +0000 UTC m=+1114.091683018" observedRunningTime="2026-03-18 10:31:25.026488642 +0000 UTC m=+1124.518222987" watchObservedRunningTime="2026-03-18 10:31:25.028614703 +0000 UTC m=+1124.520349038" Mar 18 10:31:25 crc kubenswrapper[4733]: I0318 10:31:25.722078 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 10:31:25 crc kubenswrapper[4733]: I0318 10:31:25.778425 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 10:31:25 crc kubenswrapper[4733]: I0318 10:31:25.947407 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:25 crc kubenswrapper[4733]: I0318 10:31:25.987210 4733 generic.go:334] "Generic (PLEG): container finished" podID="dc60b49b-96fa-40fd-a8e5-40c810f5ef80" containerID="1996bad471830d839e4c51afbee66ae68a9d0380538f2fb4187d0e6cb1e23827" exitCode=0 Mar 18 10:31:25 crc kubenswrapper[4733]: I0318 10:31:25.987251 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"dc60b49b-96fa-40fd-a8e5-40c810f5ef80","Type":"ContainerDied","Data":"1996bad471830d839e4c51afbee66ae68a9d0380538f2fb4187d0e6cb1e23827"} Mar 18 10:31:25 crc kubenswrapper[4733]: I0318 10:31:25.987901 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 10:31:26 crc kubenswrapper[4733]: I0318 10:31:26.781425 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 10:31:26 crc kubenswrapper[4733]: I0318 10:31:26.995531 4733 generic.go:334] "Generic (PLEG): container finished" podID="0208d826-df0f-41c8-83a7-821a21b7b85d" containerID="3874dea0990730a5c77d703a00ba1f3a9abfb0de15e4fe3acdd1a096b1bf4ecc" exitCode=0 Mar 18 10:31:26 crc kubenswrapper[4733]: I0318 10:31:26.995675 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0208d826-df0f-41c8-83a7-821a21b7b85d","Type":"ContainerDied","Data":"3874dea0990730a5c77d703a00ba1f3a9abfb0de15e4fe3acdd1a096b1bf4ecc"} Mar 18 10:31:26 crc kubenswrapper[4733]: I0318 10:31:26.999659 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dc60b49b-96fa-40fd-a8e5-40c810f5ef80","Type":"ContainerStarted","Data":"9ca944cb9aa23fc7d777229b0dcc77815b008c75e467a83230e086f840b8322e"} Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.056976 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.668398087 podStartE2EDuration="35.056957954s" podCreationTimestamp="2026-03-18 10:30:52 +0000 UTC" firstStartedPulling="2026-03-18 10:31:04.675094471 +0000 UTC m=+1104.166828796" lastFinishedPulling="2026-03-18 10:31:21.063654338 +0000 UTC m=+1120.555388663" observedRunningTime="2026-03-18 10:31:27.049637056 +0000 UTC m=+1126.541371381" watchObservedRunningTime="2026-03-18 10:31:27.056957954 +0000 UTC 
m=+1126.548692279" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.059007 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.234633 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 10:31:27 crc kubenswrapper[4733]: E0318 10:31:27.234943 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" containerName="init" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.234958 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" containerName="init" Mar 18 10:31:27 crc kubenswrapper[4733]: E0318 10:31:27.234996 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" containerName="dnsmasq-dns" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.235002 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" containerName="dnsmasq-dns" Mar 18 10:31:27 crc kubenswrapper[4733]: E0318 10:31:27.235020 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56bac49-b398-4b61-9b54-7969acd2dc93" containerName="init" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.235026 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56bac49-b398-4b61-9b54-7969acd2dc93" containerName="init" Mar 18 10:31:27 crc kubenswrapper[4733]: E0318 10:31:27.235038 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56bac49-b398-4b61-9b54-7969acd2dc93" containerName="dnsmasq-dns" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.235045 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56bac49-b398-4b61-9b54-7969acd2dc93" containerName="dnsmasq-dns" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.235179 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d6ec9568-99c8-4bee-a97c-46400fcc0e73" containerName="dnsmasq-dns" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.235206 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56bac49-b398-4b61-9b54-7969acd2dc93" containerName="dnsmasq-dns" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.235965 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.246813 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.246864 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-r2dgj" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.247244 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.247548 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.268908 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.276973 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c7007d-b722-4518-a298-269808d7dfc5-config\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.277029 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96c7007d-b722-4518-a298-269808d7dfc5-scripts\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 
10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.277062 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.277090 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdljd\" (UniqueName: \"kubernetes.io/projected/96c7007d-b722-4518-a298-269808d7dfc5-kube-api-access-qdljd\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.277124 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96c7007d-b722-4518-a298-269808d7dfc5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.277173 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.277215 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.379269 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.379352 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdljd\" (UniqueName: \"kubernetes.io/projected/96c7007d-b722-4518-a298-269808d7dfc5-kube-api-access-qdljd\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.379402 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96c7007d-b722-4518-a298-269808d7dfc5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.379475 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.379505 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.379546 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c7007d-b722-4518-a298-269808d7dfc5-config\") pod 
\"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.379571 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96c7007d-b722-4518-a298-269808d7dfc5-scripts\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.380536 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/96c7007d-b722-4518-a298-269808d7dfc5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.380550 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96c7007d-b722-4518-a298-269808d7dfc5-scripts\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.384985 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c7007d-b722-4518-a298-269808d7dfc5-config\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.385486 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.385962 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.389109 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c7007d-b722-4518-a298-269808d7dfc5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.409751 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdljd\" (UniqueName: \"kubernetes.io/projected/96c7007d-b722-4518-a298-269808d7dfc5-kube-api-access-qdljd\") pod \"ovn-northd-0\" (UID: \"96c7007d-b722-4518-a298-269808d7dfc5\") " pod="openstack/ovn-northd-0" Mar 18 10:31:27 crc kubenswrapper[4733]: I0318 10:31:27.573939 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 10:31:28 crc kubenswrapper[4733]: I0318 10:31:28.007760 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0208d826-df0f-41c8-83a7-821a21b7b85d","Type":"ContainerStarted","Data":"4a28505a80aefa21b75c5f6a19c883fa6aad86f7eabe19eaca8f22fb40c92bd2"} Mar 18 10:31:28 crc kubenswrapper[4733]: I0318 10:31:28.031981 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 10:31:28 crc kubenswrapper[4733]: W0318 10:31:28.041605 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96c7007d_b722_4518_a298_269808d7dfc5.slice/crio-2a35526730e3ef6d6132defbecd56a9e4135c02443cefa8726bb9a19cdde7b0f WatchSource:0}: Error finding container 2a35526730e3ef6d6132defbecd56a9e4135c02443cefa8726bb9a19cdde7b0f: Status 404 returned error can't find the container with id 2a35526730e3ef6d6132defbecd56a9e4135c02443cefa8726bb9a19cdde7b0f Mar 18 10:31:28 crc kubenswrapper[4733]: I0318 10:31:28.052383 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.958102094 podStartE2EDuration="35.052354942s" podCreationTimestamp="2026-03-18 10:30:53 +0000 UTC" firstStartedPulling="2026-03-18 10:31:05.010963356 +0000 UTC m=+1104.502697671" lastFinishedPulling="2026-03-18 10:31:21.105216194 +0000 UTC m=+1120.596950519" observedRunningTime="2026-03-18 10:31:28.043287975 +0000 UTC m=+1127.535022340" watchObservedRunningTime="2026-03-18 10:31:28.052354942 +0000 UTC m=+1127.544089287" Mar 18 10:31:29 crc kubenswrapper[4733]: I0318 10:31:29.019896 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"96c7007d-b722-4518-a298-269808d7dfc5","Type":"ContainerStarted","Data":"2a35526730e3ef6d6132defbecd56a9e4135c02443cefa8726bb9a19cdde7b0f"} Mar 18 10:31:30 crc 
kubenswrapper[4733]: I0318 10:31:30.027387 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"96c7007d-b722-4518-a298-269808d7dfc5","Type":"ContainerStarted","Data":"bee4a30e65cbe318a5ccae86e3a925ae69bbb7b94faa9f5ce7d933a6bbfbce90"} Mar 18 10:31:30 crc kubenswrapper[4733]: I0318 10:31:30.027807 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 10:31:30 crc kubenswrapper[4733]: I0318 10:31:30.027818 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"96c7007d-b722-4518-a298-269808d7dfc5","Type":"ContainerStarted","Data":"67279349151b2537441da5a22cefbccf06053b41e8df222a62d51688bd4048e5"} Mar 18 10:31:30 crc kubenswrapper[4733]: I0318 10:31:30.045598 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.96362579 podStartE2EDuration="3.045580018s" podCreationTimestamp="2026-03-18 10:31:27 +0000 UTC" firstStartedPulling="2026-03-18 10:31:28.044587272 +0000 UTC m=+1127.536321607" lastFinishedPulling="2026-03-18 10:31:29.12654151 +0000 UTC m=+1128.618275835" observedRunningTime="2026-03-18 10:31:30.042668546 +0000 UTC m=+1129.534402881" watchObservedRunningTime="2026-03-18 10:31:30.045580018 +0000 UTC m=+1129.537314343" Mar 18 10:31:30 crc kubenswrapper[4733]: I0318 10:31:30.458438 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 10:31:31 crc kubenswrapper[4733]: I0318 10:31:31.598420 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:31 crc kubenswrapper[4733]: I0318 10:31:31.670375 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:31 crc kubenswrapper[4733]: I0318 10:31:31.743986 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5bf47b49b7-z55fc"] Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.040072 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" podUID="19376969-b236-4b21-b57f-3833a3c0c7b4" containerName="dnsmasq-dns" containerID="cri-o://58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f" gracePeriod=10 Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.499881 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.585493 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-dns-svc\") pod \"19376969-b236-4b21-b57f-3833a3c0c7b4\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.585584 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-ovsdbserver-nb\") pod \"19376969-b236-4b21-b57f-3833a3c0c7b4\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.585624 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-config\") pod \"19376969-b236-4b21-b57f-3833a3c0c7b4\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.585841 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/19376969-b236-4b21-b57f-3833a3c0c7b4-kube-api-access-nrzpq\") pod \"19376969-b236-4b21-b57f-3833a3c0c7b4\" (UID: \"19376969-b236-4b21-b57f-3833a3c0c7b4\") " 
Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.598633 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19376969-b236-4b21-b57f-3833a3c0c7b4-kube-api-access-nrzpq" (OuterVolumeSpecName: "kube-api-access-nrzpq") pod "19376969-b236-4b21-b57f-3833a3c0c7b4" (UID: "19376969-b236-4b21-b57f-3833a3c0c7b4"). InnerVolumeSpecName "kube-api-access-nrzpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.625020 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19376969-b236-4b21-b57f-3833a3c0c7b4" (UID: "19376969-b236-4b21-b57f-3833a3c0c7b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.631880 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19376969-b236-4b21-b57f-3833a3c0c7b4" (UID: "19376969-b236-4b21-b57f-3833a3c0c7b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.637088 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-config" (OuterVolumeSpecName: "config") pod "19376969-b236-4b21-b57f-3833a3c0c7b4" (UID: "19376969-b236-4b21-b57f-3833a3c0c7b4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.687068 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrzpq\" (UniqueName: \"kubernetes.io/projected/19376969-b236-4b21-b57f-3833a3c0c7b4-kube-api-access-nrzpq\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.687101 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.687115 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:32 crc kubenswrapper[4733]: I0318 10:31:32.687127 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19376969-b236-4b21-b57f-3833a3c0c7b4-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.049910 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"55f0b230-09f2-4be2-aa1f-76a37f3fe30c","Type":"ContainerStarted","Data":"416e5a0f591185bf6f079ff41418fd4332b9d5c05db84cf23ab9ab814c1c773f"} Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.050477 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.052408 4733 generic.go:334] "Generic (PLEG): container finished" podID="19376969-b236-4b21-b57f-3833a3c0c7b4" containerID="58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f" exitCode=0 Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.052447 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" event={"ID":"19376969-b236-4b21-b57f-3833a3c0c7b4","Type":"ContainerDied","Data":"58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f"} Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.052470 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" event={"ID":"19376969-b236-4b21-b57f-3833a3c0c7b4","Type":"ContainerDied","Data":"d334497d76d55a41bc85eeb60a10080fe3c2d76a582435f142bf759662b2098b"} Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.052490 4733 scope.go:117] "RemoveContainer" containerID="58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.052535 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-z55fc" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.091461 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.164764908 podStartE2EDuration="36.091431752s" podCreationTimestamp="2026-03-18 10:30:57 +0000 UTC" firstStartedPulling="2026-03-18 10:31:04.671965172 +0000 UTC m=+1104.163699497" lastFinishedPulling="2026-03-18 10:31:32.598632006 +0000 UTC m=+1132.090366341" observedRunningTime="2026-03-18 10:31:33.068652557 +0000 UTC m=+1132.560386882" watchObservedRunningTime="2026-03-18 10:31:33.091431752 +0000 UTC m=+1132.583166097" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.097026 4733 scope.go:117] "RemoveContainer" containerID="1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.107744 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-z55fc"] Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.113782 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5bf47b49b7-z55fc"] Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.118734 4733 scope.go:117] "RemoveContainer" containerID="58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f" Mar 18 10:31:33 crc kubenswrapper[4733]: E0318 10:31:33.119175 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f\": container with ID starting with 58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f not found: ID does not exist" containerID="58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.119220 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f"} err="failed to get container status \"58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f\": rpc error: code = NotFound desc = could not find container \"58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f\": container with ID starting with 58c76ce9b93e82f88df6f0ff6ee6c5e0a60fd6945aebd859567ed2120900425f not found: ID does not exist" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.119240 4733 scope.go:117] "RemoveContainer" containerID="1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10" Mar 18 10:31:33 crc kubenswrapper[4733]: E0318 10:31:33.119426 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10\": container with ID starting with 1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10 not found: ID does not exist" containerID="1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.119447 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10"} err="failed to get container status \"1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10\": rpc error: code = NotFound desc = could not find container \"1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10\": container with ID starting with 1f57d62929ee7549431c614848f4b7ee032f7791e41c871d302148faa6989a10 not found: ID does not exist" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.190450 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19376969-b236-4b21-b57f-3833a3c0c7b4" path="/var/lib/kubelet/pods/19376969-b236-4b21-b57f-3833a3c0c7b4/volumes" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.908176 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 10:31:33 crc kubenswrapper[4733]: I0318 10:31:33.908269 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 10:31:34 crc kubenswrapper[4733]: I0318 10:31:34.060714 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 10:31:34 crc kubenswrapper[4733]: I0318 10:31:34.145047 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 10:31:35 crc kubenswrapper[4733]: I0318 10:31:35.320835 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 10:31:35 crc kubenswrapper[4733]: I0318 10:31:35.320901 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 10:31:35 crc kubenswrapper[4733]: I0318 10:31:35.395318 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 
10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.160259 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.561755 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5e1e-account-create-update-r9bb4"] Mar 18 10:31:36 crc kubenswrapper[4733]: E0318 10:31:36.562132 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19376969-b236-4b21-b57f-3833a3c0c7b4" containerName="init" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.562150 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="19376969-b236-4b21-b57f-3833a3c0c7b4" containerName="init" Mar 18 10:31:36 crc kubenswrapper[4733]: E0318 10:31:36.562211 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19376969-b236-4b21-b57f-3833a3c0c7b4" containerName="dnsmasq-dns" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.562219 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="19376969-b236-4b21-b57f-3833a3c0c7b4" containerName="dnsmasq-dns" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.562425 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="19376969-b236-4b21-b57f-3833a3c0c7b4" containerName="dnsmasq-dns" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.563013 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.566085 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.579491 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5e1e-account-create-update-r9bb4"] Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.634329 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mr65v"] Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.635253 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mr65v" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.662349 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6346115-9e7a-4489-916d-a129aa83a6dd-operator-scripts\") pod \"keystone-5e1e-account-create-update-r9bb4\" (UID: \"a6346115-9e7a-4489-916d-a129aa83a6dd\") " pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.662472 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k7n9\" (UniqueName: \"kubernetes.io/projected/a6346115-9e7a-4489-916d-a129aa83a6dd-kube-api-access-6k7n9\") pod \"keystone-5e1e-account-create-update-r9bb4\" (UID: \"a6346115-9e7a-4489-916d-a129aa83a6dd\") " pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.670480 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mr65v"] Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.732785 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5gwmb"] Mar 18 10:31:36 crc 
kubenswrapper[4733]: I0318 10:31:36.733811 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5gwmb" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.745379 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5gwmb"] Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.764039 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2qhr\" (UniqueName: \"kubernetes.io/projected/30a7c351-0be1-4547-bacc-8ff02cb59328-kube-api-access-w2qhr\") pod \"keystone-db-create-mr65v\" (UID: \"30a7c351-0be1-4547-bacc-8ff02cb59328\") " pod="openstack/keystone-db-create-mr65v" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.764732 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6346115-9e7a-4489-916d-a129aa83a6dd-operator-scripts\") pod \"keystone-5e1e-account-create-update-r9bb4\" (UID: \"a6346115-9e7a-4489-916d-a129aa83a6dd\") " pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.764805 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k7n9\" (UniqueName: \"kubernetes.io/projected/a6346115-9e7a-4489-916d-a129aa83a6dd-kube-api-access-6k7n9\") pod \"keystone-5e1e-account-create-update-r9bb4\" (UID: \"a6346115-9e7a-4489-916d-a129aa83a6dd\") " pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.764855 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a7c351-0be1-4547-bacc-8ff02cb59328-operator-scripts\") pod \"keystone-db-create-mr65v\" (UID: \"30a7c351-0be1-4547-bacc-8ff02cb59328\") " pod="openstack/keystone-db-create-mr65v" Mar 18 
10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.765698 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6346115-9e7a-4489-916d-a129aa83a6dd-operator-scripts\") pod \"keystone-5e1e-account-create-update-r9bb4\" (UID: \"a6346115-9e7a-4489-916d-a129aa83a6dd\") " pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.790609 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k7n9\" (UniqueName: \"kubernetes.io/projected/a6346115-9e7a-4489-916d-a129aa83a6dd-kube-api-access-6k7n9\") pod \"keystone-5e1e-account-create-update-r9bb4\" (UID: \"a6346115-9e7a-4489-916d-a129aa83a6dd\") " pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.838163 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0937-account-create-update-bfx7n"] Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.839372 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0937-account-create-update-bfx7n" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.841676 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.846139 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0937-account-create-update-bfx7n"] Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.867283 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a5425fb-7059-4262-9c68-1420a5f3b4f1-operator-scripts\") pod \"placement-db-create-5gwmb\" (UID: \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\") " pod="openstack/placement-db-create-5gwmb" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.867334 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2qhr\" (UniqueName: \"kubernetes.io/projected/30a7c351-0be1-4547-bacc-8ff02cb59328-kube-api-access-w2qhr\") pod \"keystone-db-create-mr65v\" (UID: \"30a7c351-0be1-4547-bacc-8ff02cb59328\") " pod="openstack/keystone-db-create-mr65v" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.867396 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkxpg\" (UniqueName: \"kubernetes.io/projected/0a5425fb-7059-4262-9c68-1420a5f3b4f1-kube-api-access-vkxpg\") pod \"placement-db-create-5gwmb\" (UID: \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\") " pod="openstack/placement-db-create-5gwmb" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.867433 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a7c351-0be1-4547-bacc-8ff02cb59328-operator-scripts\") pod \"keystone-db-create-mr65v\" (UID: \"30a7c351-0be1-4547-bacc-8ff02cb59328\") " 
pod="openstack/keystone-db-create-mr65v" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.868310 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a7c351-0be1-4547-bacc-8ff02cb59328-operator-scripts\") pod \"keystone-db-create-mr65v\" (UID: \"30a7c351-0be1-4547-bacc-8ff02cb59328\") " pod="openstack/keystone-db-create-mr65v" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.880282 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:36 crc kubenswrapper[4733]: I0318 10:31:36.883756 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2qhr\" (UniqueName: \"kubernetes.io/projected/30a7c351-0be1-4547-bacc-8ff02cb59328-kube-api-access-w2qhr\") pod \"keystone-db-create-mr65v\" (UID: \"30a7c351-0be1-4547-bacc-8ff02cb59328\") " pod="openstack/keystone-db-create-mr65v" Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:36.968519 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mr65v" Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:36.968916 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07730b47-54ba-4b79-952e-6fb12b3b5279-operator-scripts\") pod \"placement-0937-account-create-update-bfx7n\" (UID: \"07730b47-54ba-4b79-952e-6fb12b3b5279\") " pod="openstack/placement-0937-account-create-update-bfx7n" Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:36.968994 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a5425fb-7059-4262-9c68-1420a5f3b4f1-operator-scripts\") pod \"placement-db-create-5gwmb\" (UID: \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\") " pod="openstack/placement-db-create-5gwmb" Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:36.969079 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkxpg\" (UniqueName: \"kubernetes.io/projected/0a5425fb-7059-4262-9c68-1420a5f3b4f1-kube-api-access-vkxpg\") pod \"placement-db-create-5gwmb\" (UID: \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\") " pod="openstack/placement-db-create-5gwmb" Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:36.969101 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrskv\" (UniqueName: \"kubernetes.io/projected/07730b47-54ba-4b79-952e-6fb12b3b5279-kube-api-access-hrskv\") pod \"placement-0937-account-create-update-bfx7n\" (UID: \"07730b47-54ba-4b79-952e-6fb12b3b5279\") " pod="openstack/placement-0937-account-create-update-bfx7n" Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:36.969939 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a5425fb-7059-4262-9c68-1420a5f3b4f1-operator-scripts\") pod 
\"placement-db-create-5gwmb\" (UID: \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\") " pod="openstack/placement-db-create-5gwmb"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.042554 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkxpg\" (UniqueName: \"kubernetes.io/projected/0a5425fb-7059-4262-9c68-1420a5f3b4f1-kube-api-access-vkxpg\") pod \"placement-db-create-5gwmb\" (UID: \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\") " pod="openstack/placement-db-create-5gwmb"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.050221 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5gwmb"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.070674 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrskv\" (UniqueName: \"kubernetes.io/projected/07730b47-54ba-4b79-952e-6fb12b3b5279-kube-api-access-hrskv\") pod \"placement-0937-account-create-update-bfx7n\" (UID: \"07730b47-54ba-4b79-952e-6fb12b3b5279\") " pod="openstack/placement-0937-account-create-update-bfx7n"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.071732 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07730b47-54ba-4b79-952e-6fb12b3b5279-operator-scripts\") pod \"placement-0937-account-create-update-bfx7n\" (UID: \"07730b47-54ba-4b79-952e-6fb12b3b5279\") " pod="openstack/placement-0937-account-create-update-bfx7n"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.071888 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07730b47-54ba-4b79-952e-6fb12b3b5279-operator-scripts\") pod \"placement-0937-account-create-update-bfx7n\" (UID: \"07730b47-54ba-4b79-952e-6fb12b3b5279\") " pod="openstack/placement-0937-account-create-update-bfx7n"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.090955 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrskv\" (UniqueName: \"kubernetes.io/projected/07730b47-54ba-4b79-952e-6fb12b3b5279-kube-api-access-hrskv\") pod \"placement-0937-account-create-update-bfx7n\" (UID: \"07730b47-54ba-4b79-952e-6fb12b3b5279\") " pod="openstack/placement-0937-account-create-update-bfx7n"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.159863 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0937-account-create-update-bfx7n"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.367814 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5e1e-account-create-update-r9bb4"]
Mar 18 10:31:37 crc kubenswrapper[4733]: W0318 10:31:37.372112 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6346115_9e7a_4489_916d_a129aa83a6dd.slice/crio-6836746bdde3ee4414bdc9c36bf575e7d73bcbca50d7355ca4160c2ed4ad5259 WatchSource:0}: Error finding container 6836746bdde3ee4414bdc9c36bf575e7d73bcbca50d7355ca4160c2ed4ad5259: Status 404 returned error can't find the container with id 6836746bdde3ee4414bdc9c36bf575e7d73bcbca50d7355ca4160c2ed4ad5259
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.483774 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mr65v"]
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.569583 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5gwmb"]
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.663217 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0937-account-create-update-bfx7n"]
Mar 18 10:31:37 crc kubenswrapper[4733]: W0318 10:31:37.684782 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07730b47_54ba_4b79_952e_6fb12b3b5279.slice/crio-4dd3dd5968f22d9782c54516f96136d7407f843a2525378831477533afa22b84 WatchSource:0}: Error finding container 4dd3dd5968f22d9782c54516f96136d7407f843a2525378831477533afa22b84: Status 404 returned error can't find the container with id 4dd3dd5968f22d9782c54516f96136d7407f843a2525378831477533afa22b84
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.802591 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.859547 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-h4pnt"]
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.860844 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.894455 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-h4pnt"]
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.985412 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.985577 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgwm\" (UniqueName: \"kubernetes.io/projected/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-kube-api-access-clgwm\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.985647 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.985689 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-config\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:37 crc kubenswrapper[4733]: I0318 10:31:37.985754 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.091758 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clgwm\" (UniqueName: \"kubernetes.io/projected/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-kube-api-access-clgwm\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.091850 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.091897 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-config\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.091927 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.091966 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.092906 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.093860 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.100251 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.112066 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-config\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.113283 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e1e-account-create-update-r9bb4" event={"ID":"a6346115-9e7a-4489-916d-a129aa83a6dd","Type":"ContainerStarted","Data":"1a52840f130018d9dd9a4d4957090d0bfe7cddccea8c86d998fc7ce63f88d2c3"}
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.113331 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e1e-account-create-update-r9bb4" event={"ID":"a6346115-9e7a-4489-916d-a129aa83a6dd","Type":"ContainerStarted","Data":"6836746bdde3ee4414bdc9c36bf575e7d73bcbca50d7355ca4160c2ed4ad5259"}
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.114319 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgwm\" (UniqueName: \"kubernetes.io/projected/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-kube-api-access-clgwm\") pod \"dnsmasq-dns-b8fbc5445-h4pnt\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.124692 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mr65v" event={"ID":"30a7c351-0be1-4547-bacc-8ff02cb59328","Type":"ContainerStarted","Data":"cb5c331f367d49d9d35cab0a581b0fd4e3d8921934861b35f887d6648ae09cfb"}
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.124757 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mr65v" event={"ID":"30a7c351-0be1-4547-bacc-8ff02cb59328","Type":"ContainerStarted","Data":"e708787994921982a80be3d54b5684318bbda7cf4ce148559d792c7f967c93d8"}
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.128004 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5gwmb" event={"ID":"0a5425fb-7059-4262-9c68-1420a5f3b4f1","Type":"ContainerStarted","Data":"ce9a99c6df86d54aacd4034e75a79275a1f1a3fe6a26a1b9d309967e3b0b146b"}
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.128052 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5gwmb" event={"ID":"0a5425fb-7059-4262-9c68-1420a5f3b4f1","Type":"ContainerStarted","Data":"f99fa9061248dffa81f529bc9a8356e3700d233c627489528d2203d1084f078a"}
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.138680 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0937-account-create-update-bfx7n" event={"ID":"07730b47-54ba-4b79-952e-6fb12b3b5279","Type":"ContainerStarted","Data":"881999bdad04a088176edfb2a1165638bbb818ce5892ed189c2612e4735ca703"}
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.138866 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5e1e-account-create-update-r9bb4" podStartSLOduration=2.138845979 podStartE2EDuration="2.138845979s" podCreationTimestamp="2026-03-18 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:31:38.128360592 +0000 UTC m=+1137.620094917" watchObservedRunningTime="2026-03-18 10:31:38.138845979 +0000 UTC m=+1137.630580304"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.138889 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0937-account-create-update-bfx7n" event={"ID":"07730b47-54ba-4b79-952e-6fb12b3b5279","Type":"ContainerStarted","Data":"4dd3dd5968f22d9782c54516f96136d7407f843a2525378831477533afa22b84"}
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.156388 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-mr65v" podStartSLOduration=2.156372835 podStartE2EDuration="2.156372835s" podCreationTimestamp="2026-03-18 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:31:38.149753788 +0000 UTC m=+1137.641488123" watchObservedRunningTime="2026-03-18 10:31:38.156372835 +0000 UTC m=+1137.648107160"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.175870 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-5gwmb" podStartSLOduration=2.175856936 podStartE2EDuration="2.175856936s" podCreationTimestamp="2026-03-18 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:31:38.168594071 +0000 UTC m=+1137.660328396" watchObservedRunningTime="2026-03-18 10:31:38.175856936 +0000 UTC m=+1137.667591261"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.205765 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.696573 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0937-account-create-update-bfx7n" podStartSLOduration=2.696553071 podStartE2EDuration="2.696553071s" podCreationTimestamp="2026-03-18 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:31:38.185469028 +0000 UTC m=+1137.677203353" watchObservedRunningTime="2026-03-18 10:31:38.696553071 +0000 UTC m=+1138.188287396"
Mar 18 10:31:38 crc kubenswrapper[4733]: I0318 10:31:38.699825 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-h4pnt"]
Mar 18 10:31:38 crc kubenswrapper[4733]: W0318 10:31:38.705836 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bdf8dbb_ffe1_48d1_9c79_22e37dd882be.slice/crio-5dd7dc77696d4097c1648883d1fba422fc00eb1a9ede4031a68c1b0d6e1e9d1c WatchSource:0}: Error finding container 5dd7dc77696d4097c1648883d1fba422fc00eb1a9ede4031a68c1b0d6e1e9d1c: Status 404 returned error can't find the container with id 5dd7dc77696d4097c1648883d1fba422fc00eb1a9ede4031a68c1b0d6e1e9d1c
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.077555 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.082517 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.085504 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.085706 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.085853 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.086001 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-bhzrc"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.113076 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.150766 4733 generic.go:334] "Generic (PLEG): container finished" podID="a6346115-9e7a-4489-916d-a129aa83a6dd" containerID="1a52840f130018d9dd9a4d4957090d0bfe7cddccea8c86d998fc7ce63f88d2c3" exitCode=0
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.151162 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e1e-account-create-update-r9bb4" event={"ID":"a6346115-9e7a-4489-916d-a129aa83a6dd","Type":"ContainerDied","Data":"1a52840f130018d9dd9a4d4957090d0bfe7cddccea8c86d998fc7ce63f88d2c3"}
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.153241 4733 generic.go:334] "Generic (PLEG): container finished" podID="0a5425fb-7059-4262-9c68-1420a5f3b4f1" containerID="ce9a99c6df86d54aacd4034e75a79275a1f1a3fe6a26a1b9d309967e3b0b146b" exitCode=0
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.153325 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5gwmb" event={"ID":"0a5425fb-7059-4262-9c68-1420a5f3b4f1","Type":"ContainerDied","Data":"ce9a99c6df86d54aacd4034e75a79275a1f1a3fe6a26a1b9d309967e3b0b146b"}
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.155291 4733 generic.go:334] "Generic (PLEG): container finished" podID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" containerID="be1323a707d76c996153e9edb3286a8842293d7d0852b41ecba2e5d11f48e074" exitCode=0
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.155353 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" event={"ID":"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be","Type":"ContainerDied","Data":"be1323a707d76c996153e9edb3286a8842293d7d0852b41ecba2e5d11f48e074"}
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.155375 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" event={"ID":"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be","Type":"ContainerStarted","Data":"5dd7dc77696d4097c1648883d1fba422fc00eb1a9ede4031a68c1b0d6e1e9d1c"}
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.159086 4733 generic.go:334] "Generic (PLEG): container finished" podID="07730b47-54ba-4b79-952e-6fb12b3b5279" containerID="881999bdad04a088176edfb2a1165638bbb818ce5892ed189c2612e4735ca703" exitCode=0
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.159198 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0937-account-create-update-bfx7n" event={"ID":"07730b47-54ba-4b79-952e-6fb12b3b5279","Type":"ContainerDied","Data":"881999bdad04a088176edfb2a1165638bbb818ce5892ed189c2612e4735ca703"}
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.160903 4733 generic.go:334] "Generic (PLEG): container finished" podID="30a7c351-0be1-4547-bacc-8ff02cb59328" containerID="cb5c331f367d49d9d35cab0a581b0fd4e3d8921934861b35f887d6648ae09cfb" exitCode=0
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.160994 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mr65v" event={"ID":"30a7c351-0be1-4547-bacc-8ff02cb59328","Type":"ContainerDied","Data":"cb5c331f367d49d9d35cab0a581b0fd4e3d8921934861b35f887d6648ae09cfb"}
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.209058 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.209156 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.209325 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.209349 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9pgx\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-kube-api-access-n9pgx\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.209391 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-cache\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.209413 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-lock\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.311259 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.311346 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9pgx\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-kube-api-access-n9pgx\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.311385 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-cache\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.311442 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-lock\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.311521 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: E0318 10:31:39.311526 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 18 10:31:39 crc kubenswrapper[4733]: E0318 10:31:39.311554 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 18 10:31:39 crc kubenswrapper[4733]: E0318 10:31:39.311612 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift podName:4f94cfc9-67cc-474c-8d99-58a9d4e0273f nodeName:}" failed. No retries permitted until 2026-03-18 10:31:39.811591225 +0000 UTC m=+1139.303325650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift") pod "swift-storage-0" (UID: "4f94cfc9-67cc-474c-8d99-58a9d4e0273f") : configmap "swift-ring-files" not found
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.311728 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.312006 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-lock\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.312299 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-cache\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.312517 4733 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.322130 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.331466 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9pgx\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-kube-api-access-n9pgx\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.341126 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: I0318 10:31:39.820150 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:39 crc kubenswrapper[4733]: E0318 10:31:39.820388 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 18 10:31:39 crc kubenswrapper[4733]: E0318 10:31:39.821249 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 18 10:31:39 crc kubenswrapper[4733]: E0318 10:31:39.821337 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift podName:4f94cfc9-67cc-474c-8d99-58a9d4e0273f nodeName:}" failed. No retries permitted until 2026-03-18 10:31:40.82131139 +0000 UTC m=+1140.313045725 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift") pod "swift-storage-0" (UID: "4f94cfc9-67cc-474c-8d99-58a9d4e0273f") : configmap "swift-ring-files" not found
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.172227 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" event={"ID":"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be","Type":"ContainerStarted","Data":"76c24ca485c5cd0e612df85bd0c8ef951256abf933fe1359b894cd82b8ea15fb"}
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.205635 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" podStartSLOduration=3.205619546 podStartE2EDuration="3.205619546s" podCreationTimestamp="2026-03-18 10:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:31:40.198578337 +0000 UTC m=+1139.690312672" watchObservedRunningTime="2026-03-18 10:31:40.205619546 +0000 UTC m=+1139.697353871"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.636521 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-fvlqt"]
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.637857 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fvlqt"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.643492 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fvlqt"]
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.710101 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0937-account-create-update-bfx7n"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.715000 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5795-account-create-update-nkww7"]
Mar 18 10:31:40 crc kubenswrapper[4733]: E0318 10:31:40.715421 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07730b47-54ba-4b79-952e-6fb12b3b5279" containerName="mariadb-account-create-update"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.715439 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="07730b47-54ba-4b79-952e-6fb12b3b5279" containerName="mariadb-account-create-update"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.715629 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="07730b47-54ba-4b79-952e-6fb12b3b5279" containerName="mariadb-account-create-update"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.716318 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5795-account-create-update-nkww7"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.719129 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.743660 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spqc4\" (UniqueName: \"kubernetes.io/projected/84d4401f-2343-41fa-82ae-877674337bf4-kube-api-access-spqc4\") pod \"glance-db-create-fvlqt\" (UID: \"84d4401f-2343-41fa-82ae-877674337bf4\") " pod="openstack/glance-db-create-fvlqt"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.743695 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d4401f-2343-41fa-82ae-877674337bf4-operator-scripts\") pod \"glance-db-create-fvlqt\" (UID: \"84d4401f-2343-41fa-82ae-877674337bf4\") " pod="openstack/glance-db-create-fvlqt"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.753751 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5795-account-create-update-nkww7"]
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.786159 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5gwmb"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.835776 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mr65v"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.841961 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5e1e-account-create-update-r9bb4"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.845709 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a5425fb-7059-4262-9c68-1420a5f3b4f1-operator-scripts\") pod \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\" (UID: \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\") "
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.845807 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07730b47-54ba-4b79-952e-6fb12b3b5279-operator-scripts\") pod \"07730b47-54ba-4b79-952e-6fb12b3b5279\" (UID: \"07730b47-54ba-4b79-952e-6fb12b3b5279\") "
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.845836 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrskv\" (UniqueName: \"kubernetes.io/projected/07730b47-54ba-4b79-952e-6fb12b3b5279-kube-api-access-hrskv\") pod \"07730b47-54ba-4b79-952e-6fb12b3b5279\" (UID: \"07730b47-54ba-4b79-952e-6fb12b3b5279\") "
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.845878 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkxpg\" (UniqueName: \"kubernetes.io/projected/0a5425fb-7059-4262-9c68-1420a5f3b4f1-kube-api-access-vkxpg\") pod \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\" (UID: \"0a5425fb-7059-4262-9c68-1420a5f3b4f1\") "
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.846175 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a5425fb-7059-4262-9c68-1420a5f3b4f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a5425fb-7059-4262-9c68-1420a5f3b4f1" (UID: "0a5425fb-7059-4262-9c68-1420a5f3b4f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.846210 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5315db-fb68-4558-85c1-cf538d0e2770-operator-scripts\") pod \"glance-5795-account-create-update-nkww7\" (UID: \"ff5315db-fb68-4558-85c1-cf538d0e2770\") " pod="openstack/glance-5795-account-create-update-nkww7"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.846255 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.846286 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spqc4\" (UniqueName: \"kubernetes.io/projected/84d4401f-2343-41fa-82ae-877674337bf4-kube-api-access-spqc4\") pod \"glance-db-create-fvlqt\" (UID: \"84d4401f-2343-41fa-82ae-877674337bf4\") " pod="openstack/glance-db-create-fvlqt"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.846308 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d4401f-2343-41fa-82ae-877674337bf4-operator-scripts\") pod \"glance-db-create-fvlqt\" (UID: \"84d4401f-2343-41fa-82ae-877674337bf4\") " pod="openstack/glance-db-create-fvlqt"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.846353 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfscv\" (UniqueName: \"kubernetes.io/projected/ff5315db-fb68-4558-85c1-cf538d0e2770-kube-api-access-jfscv\") pod \"glance-5795-account-create-update-nkww7\" (UID: \"ff5315db-fb68-4558-85c1-cf538d0e2770\") " pod="openstack/glance-5795-account-create-update-nkww7"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.846462 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a5425fb-7059-4262-9c68-1420a5f3b4f1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 10:31:40 crc kubenswrapper[4733]: E0318 10:31:40.846583 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 18 10:31:40 crc kubenswrapper[4733]: E0318 10:31:40.846640 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 18 10:31:40 crc kubenswrapper[4733]: E0318 10:31:40.846686 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift podName:4f94cfc9-67cc-474c-8d99-58a9d4e0273f nodeName:}" failed. No retries permitted until 2026-03-18 10:31:42.846668957 +0000 UTC m=+1142.338403282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift") pod "swift-storage-0" (UID: "4f94cfc9-67cc-474c-8d99-58a9d4e0273f") : configmap "swift-ring-files" not found
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.846977 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d4401f-2343-41fa-82ae-877674337bf4-operator-scripts\") pod \"glance-db-create-fvlqt\" (UID: \"84d4401f-2343-41fa-82ae-877674337bf4\") " pod="openstack/glance-db-create-fvlqt"
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.847648 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07730b47-54ba-4b79-952e-6fb12b3b5279-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07730b47-54ba-4b79-952e-6fb12b3b5279" (UID: "07730b47-54ba-4b79-952e-6fb12b3b5279"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.866320 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07730b47-54ba-4b79-952e-6fb12b3b5279-kube-api-access-hrskv" (OuterVolumeSpecName: "kube-api-access-hrskv") pod "07730b47-54ba-4b79-952e-6fb12b3b5279" (UID: "07730b47-54ba-4b79-952e-6fb12b3b5279"). InnerVolumeSpecName "kube-api-access-hrskv".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.867834 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spqc4\" (UniqueName: \"kubernetes.io/projected/84d4401f-2343-41fa-82ae-877674337bf4-kube-api-access-spqc4\") pod \"glance-db-create-fvlqt\" (UID: \"84d4401f-2343-41fa-82ae-877674337bf4\") " pod="openstack/glance-db-create-fvlqt" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.876495 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a5425fb-7059-4262-9c68-1420a5f3b4f1-kube-api-access-vkxpg" (OuterVolumeSpecName: "kube-api-access-vkxpg") pod "0a5425fb-7059-4262-9c68-1420a5f3b4f1" (UID: "0a5425fb-7059-4262-9c68-1420a5f3b4f1"). InnerVolumeSpecName "kube-api-access-vkxpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.954384 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2qhr\" (UniqueName: \"kubernetes.io/projected/30a7c351-0be1-4547-bacc-8ff02cb59328-kube-api-access-w2qhr\") pod \"30a7c351-0be1-4547-bacc-8ff02cb59328\" (UID: \"30a7c351-0be1-4547-bacc-8ff02cb59328\") " Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.954441 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6346115-9e7a-4489-916d-a129aa83a6dd-operator-scripts\") pod \"a6346115-9e7a-4489-916d-a129aa83a6dd\" (UID: \"a6346115-9e7a-4489-916d-a129aa83a6dd\") " Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.954581 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k7n9\" (UniqueName: \"kubernetes.io/projected/a6346115-9e7a-4489-916d-a129aa83a6dd-kube-api-access-6k7n9\") pod \"a6346115-9e7a-4489-916d-a129aa83a6dd\" (UID: \"a6346115-9e7a-4489-916d-a129aa83a6dd\") " Mar 
18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.954653 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a7c351-0be1-4547-bacc-8ff02cb59328-operator-scripts\") pod \"30a7c351-0be1-4547-bacc-8ff02cb59328\" (UID: \"30a7c351-0be1-4547-bacc-8ff02cb59328\") " Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.954876 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5315db-fb68-4558-85c1-cf538d0e2770-operator-scripts\") pod \"glance-5795-account-create-update-nkww7\" (UID: \"ff5315db-fb68-4558-85c1-cf538d0e2770\") " pod="openstack/glance-5795-account-create-update-nkww7" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.955027 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfscv\" (UniqueName: \"kubernetes.io/projected/ff5315db-fb68-4558-85c1-cf538d0e2770-kube-api-access-jfscv\") pod \"glance-5795-account-create-update-nkww7\" (UID: \"ff5315db-fb68-4558-85c1-cf538d0e2770\") " pod="openstack/glance-5795-account-create-update-nkww7" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.955301 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07730b47-54ba-4b79-952e-6fb12b3b5279-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.955315 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrskv\" (UniqueName: \"kubernetes.io/projected/07730b47-54ba-4b79-952e-6fb12b3b5279-kube-api-access-hrskv\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.955328 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkxpg\" (UniqueName: \"kubernetes.io/projected/0a5425fb-7059-4262-9c68-1420a5f3b4f1-kube-api-access-vkxpg\") on 
node \"crc\" DevicePath \"\"" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.956639 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6346115-9e7a-4489-916d-a129aa83a6dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6346115-9e7a-4489-916d-a129aa83a6dd" (UID: "a6346115-9e7a-4489-916d-a129aa83a6dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.956895 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30a7c351-0be1-4547-bacc-8ff02cb59328-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30a7c351-0be1-4547-bacc-8ff02cb59328" (UID: "30a7c351-0be1-4547-bacc-8ff02cb59328"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.957493 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5315db-fb68-4558-85c1-cf538d0e2770-operator-scripts\") pod \"glance-5795-account-create-update-nkww7\" (UID: \"ff5315db-fb68-4558-85c1-cf538d0e2770\") " pod="openstack/glance-5795-account-create-update-nkww7" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.959865 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6346115-9e7a-4489-916d-a129aa83a6dd-kube-api-access-6k7n9" (OuterVolumeSpecName: "kube-api-access-6k7n9") pod "a6346115-9e7a-4489-916d-a129aa83a6dd" (UID: "a6346115-9e7a-4489-916d-a129aa83a6dd"). InnerVolumeSpecName "kube-api-access-6k7n9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.959913 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a7c351-0be1-4547-bacc-8ff02cb59328-kube-api-access-w2qhr" (OuterVolumeSpecName: "kube-api-access-w2qhr") pod "30a7c351-0be1-4547-bacc-8ff02cb59328" (UID: "30a7c351-0be1-4547-bacc-8ff02cb59328"). InnerVolumeSpecName "kube-api-access-w2qhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:40 crc kubenswrapper[4733]: I0318 10:31:40.972813 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfscv\" (UniqueName: \"kubernetes.io/projected/ff5315db-fb68-4558-85c1-cf538d0e2770-kube-api-access-jfscv\") pod \"glance-5795-account-create-update-nkww7\" (UID: \"ff5315db-fb68-4558-85c1-cf538d0e2770\") " pod="openstack/glance-5795-account-create-update-nkww7" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.056330 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k7n9\" (UniqueName: \"kubernetes.io/projected/a6346115-9e7a-4489-916d-a129aa83a6dd-kube-api-access-6k7n9\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.056364 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a7c351-0be1-4547-bacc-8ff02cb59328-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.056375 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2qhr\" (UniqueName: \"kubernetes.io/projected/30a7c351-0be1-4547-bacc-8ff02cb59328-kube-api-access-w2qhr\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.056386 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a6346115-9e7a-4489-916d-a129aa83a6dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.083321 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fvlqt" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.105171 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5795-account-create-update-nkww7" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.200698 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e1e-account-create-update-r9bb4" event={"ID":"a6346115-9e7a-4489-916d-a129aa83a6dd","Type":"ContainerDied","Data":"6836746bdde3ee4414bdc9c36bf575e7d73bcbca50d7355ca4160c2ed4ad5259"} Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.200744 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6836746bdde3ee4414bdc9c36bf575e7d73bcbca50d7355ca4160c2ed4ad5259" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.200749 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5e1e-account-create-update-r9bb4" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.202419 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mr65v" event={"ID":"30a7c351-0be1-4547-bacc-8ff02cb59328","Type":"ContainerDied","Data":"e708787994921982a80be3d54b5684318bbda7cf4ce148559d792c7f967c93d8"} Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.202448 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e708787994921982a80be3d54b5684318bbda7cf4ce148559d792c7f967c93d8" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.202508 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mr65v" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.209216 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5gwmb" event={"ID":"0a5425fb-7059-4262-9c68-1420a5f3b4f1","Type":"ContainerDied","Data":"f99fa9061248dffa81f529bc9a8356e3700d233c627489528d2203d1084f078a"} Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.209267 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f99fa9061248dffa81f529bc9a8356e3700d233c627489528d2203d1084f078a" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.209356 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5gwmb" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.211884 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0937-account-create-update-bfx7n" event={"ID":"07730b47-54ba-4b79-952e-6fb12b3b5279","Type":"ContainerDied","Data":"4dd3dd5968f22d9782c54516f96136d7407f843a2525378831477533afa22b84"} Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.211959 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dd3dd5968f22d9782c54516f96136d7407f843a2525378831477533afa22b84" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.211932 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0937-account-create-update-bfx7n" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.212068 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.707683 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5795-account-create-update-nkww7"] Mar 18 10:31:41 crc kubenswrapper[4733]: W0318 10:31:41.710515 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff5315db_fb68_4558_85c1_cf538d0e2770.slice/crio-2fb7df9dc8f479fa452223731ed81c51e757e022bc1a6f323d9c6bcfdc951d47 WatchSource:0}: Error finding container 2fb7df9dc8f479fa452223731ed81c51e757e022bc1a6f323d9c6bcfdc951d47: Status 404 returned error can't find the container with id 2fb7df9dc8f479fa452223731ed81c51e757e022bc1a6f323d9c6bcfdc951d47 Mar 18 10:31:41 crc kubenswrapper[4733]: W0318 10:31:41.717999 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d4401f_2343_41fa_82ae_877674337bf4.slice/crio-b8b55af5d60c9bfa951021421dc142b3018bef82cf543260041378c3f0e7cbde WatchSource:0}: Error finding container b8b55af5d60c9bfa951021421dc142b3018bef82cf543260041378c3f0e7cbde: Status 404 returned error can't find the container with id b8b55af5d60c9bfa951021421dc142b3018bef82cf543260041378c3f0e7cbde Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.721857 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 10:31:41 crc kubenswrapper[4733]: I0318 10:31:41.725584 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fvlqt"] Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.220915 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-5795-account-create-update-nkww7" event={"ID":"ff5315db-fb68-4558-85c1-cf538d0e2770","Type":"ContainerStarted","Data":"2fb7df9dc8f479fa452223731ed81c51e757e022bc1a6f323d9c6bcfdc951d47"} Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.222261 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fvlqt" event={"ID":"84d4401f-2343-41fa-82ae-877674337bf4","Type":"ContainerStarted","Data":"b8b55af5d60c9bfa951021421dc142b3018bef82cf543260041378c3f0e7cbde"} Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.548748 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dj64z"] Mar 18 10:31:42 crc kubenswrapper[4733]: E0318 10:31:42.549088 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6346115-9e7a-4489-916d-a129aa83a6dd" containerName="mariadb-account-create-update" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.549105 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6346115-9e7a-4489-916d-a129aa83a6dd" containerName="mariadb-account-create-update" Mar 18 10:31:42 crc kubenswrapper[4733]: E0318 10:31:42.549119 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a7c351-0be1-4547-bacc-8ff02cb59328" containerName="mariadb-database-create" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.549126 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a7c351-0be1-4547-bacc-8ff02cb59328" containerName="mariadb-database-create" Mar 18 10:31:42 crc kubenswrapper[4733]: E0318 10:31:42.549136 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a5425fb-7059-4262-9c68-1420a5f3b4f1" containerName="mariadb-database-create" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.549147 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a5425fb-7059-4262-9c68-1420a5f3b4f1" containerName="mariadb-database-create" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.549328 
4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a7c351-0be1-4547-bacc-8ff02cb59328" containerName="mariadb-database-create" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.549342 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a5425fb-7059-4262-9c68-1420a5f3b4f1" containerName="mariadb-database-create" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.549356 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6346115-9e7a-4489-916d-a129aa83a6dd" containerName="mariadb-account-create-update" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.549845 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.553671 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.565095 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dj64z"] Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.695825 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x78zp\" (UniqueName: \"kubernetes.io/projected/53658e2a-4376-49b7-82eb-f46c3dee3b6a-kube-api-access-x78zp\") pod \"root-account-create-update-dj64z\" (UID: \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\") " pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.695886 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53658e2a-4376-49b7-82eb-f46c3dee3b6a-operator-scripts\") pod \"root-account-create-update-dj64z\" (UID: \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\") " pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:42 crc kubenswrapper[4733]: 
I0318 10:31:42.797952 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x78zp\" (UniqueName: \"kubernetes.io/projected/53658e2a-4376-49b7-82eb-f46c3dee3b6a-kube-api-access-x78zp\") pod \"root-account-create-update-dj64z\" (UID: \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\") " pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.798023 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53658e2a-4376-49b7-82eb-f46c3dee3b6a-operator-scripts\") pod \"root-account-create-update-dj64z\" (UID: \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\") " pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.798847 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53658e2a-4376-49b7-82eb-f46c3dee3b6a-operator-scripts\") pod \"root-account-create-update-dj64z\" (UID: \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\") " pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.816853 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x78zp\" (UniqueName: \"kubernetes.io/projected/53658e2a-4376-49b7-82eb-f46c3dee3b6a-kube-api-access-x78zp\") pod \"root-account-create-update-dj64z\" (UID: \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\") " pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.867855 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.900161 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0" Mar 18 10:31:42 crc kubenswrapper[4733]: E0318 10:31:42.900431 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 10:31:42 crc kubenswrapper[4733]: E0318 10:31:42.900448 4733 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 10:31:42 crc kubenswrapper[4733]: E0318 10:31:42.900497 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift podName:4f94cfc9-67cc-474c-8d99-58a9d4e0273f nodeName:}" failed. No retries permitted until 2026-03-18 10:31:46.900479547 +0000 UTC m=+1146.392213872 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift") pod "swift-storage-0" (UID: "4f94cfc9-67cc-474c-8d99-58a9d4e0273f") : configmap "swift-ring-files" not found Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.909898 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5ngrz"] Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.911107 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.916664 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.916676 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.926901 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.948308 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5ngrz"] Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.976004 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nfmp2"] Mar 18 10:31:42 crc kubenswrapper[4733]: I0318 10:31:42.976984 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.012173 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-swiftconf\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.012397 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-combined-ca-bundle\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.012520 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-scripts\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.012631 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh78t\" (UniqueName: \"kubernetes.io/projected/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-kube-api-access-qh78t\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.012713 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-dispersionconf\") pod \"swift-ring-rebalance-5ngrz\" (UID: 
\"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.012807 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-etc-swift\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.012877 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-ring-data-devices\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.014969 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nfmp2"] Mar 18 10:31:43 crc kubenswrapper[4733]: E0318 10:31:43.018590 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-qh78t ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-5ngrz" podUID="ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.037412 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5ngrz"] Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133320 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-dispersionconf\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " 
pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133386 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-combined-ca-bundle\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133419 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh78t\" (UniqueName: \"kubernetes.io/projected/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-kube-api-access-qh78t\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133446 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-dispersionconf\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133481 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e3fc960-7783-4952-90c9-1551c780ae03-etc-swift\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133504 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-swiftconf\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " 
pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133536 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-etc-swift\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133559 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-ring-data-devices\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133586 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-scripts\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133629 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-ring-data-devices\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133645 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2xzc\" (UniqueName: \"kubernetes.io/projected/5e3fc960-7783-4952-90c9-1551c780ae03-kube-api-access-c2xzc\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " 
pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133662 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-swiftconf\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133681 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-combined-ca-bundle\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.133709 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-scripts\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.134803 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-scripts\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.135614 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-etc-swift\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.136070 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-ring-data-devices\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.142357 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-combined-ca-bundle\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.163789 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-swiftconf\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.167538 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-dispersionconf\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.211076 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh78t\" (UniqueName: \"kubernetes.io/projected/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-kube-api-access-qh78t\") pod \"swift-ring-rebalance-5ngrz\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.272323 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-ring-data-devices\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.272367 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2xzc\" (UniqueName: \"kubernetes.io/projected/5e3fc960-7783-4952-90c9-1551c780ae03-kube-api-access-c2xzc\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.272425 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-dispersionconf\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.272467 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-combined-ca-bundle\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.272526 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e3fc960-7783-4952-90c9-1551c780ae03-etc-swift\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.272546 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-swiftconf\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.272587 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-scripts\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.273373 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-scripts\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.274466 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-ring-data-devices\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.275744 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e3fc960-7783-4952-90c9-1551c780ae03-etc-swift\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.280959 4733 generic.go:334] "Generic (PLEG): container finished" podID="84d4401f-2343-41fa-82ae-877674337bf4" containerID="c39830f7afc41d4e539449a18ea110efdcfaa942a6d99809d36a31233d6cb82b" exitCode=0 Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 
10:31:43.281686 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-combined-ca-bundle\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.281808 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fvlqt" event={"ID":"84d4401f-2343-41fa-82ae-877674337bf4","Type":"ContainerDied","Data":"c39830f7afc41d4e539449a18ea110efdcfaa942a6d99809d36a31233d6cb82b"} Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.281831 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-dispersionconf\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.284979 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-swiftconf\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.287522 4733 generic.go:334] "Generic (PLEG): container finished" podID="ff5315db-fb68-4558-85c1-cf538d0e2770" containerID="754b489534a1d2e07cfe28b803b1041b8c35b7a8d870ab7643873669d480405d" exitCode=0 Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.287592 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.287963 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5795-account-create-update-nkww7" event={"ID":"ff5315db-fb68-4558-85c1-cf538d0e2770","Type":"ContainerDied","Data":"754b489534a1d2e07cfe28b803b1041b8c35b7a8d870ab7643873669d480405d"} Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.304368 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2xzc\" (UniqueName: \"kubernetes.io/projected/5e3fc960-7783-4952-90c9-1551c780ae03-kube-api-access-c2xzc\") pod \"swift-ring-rebalance-nfmp2\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.311772 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.337254 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-bhzrc" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.346261 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.422073 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dj64z"] Mar 18 10:31:43 crc kubenswrapper[4733]: W0318 10:31:43.433719 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53658e2a_4376_49b7_82eb_f46c3dee3b6a.slice/crio-aebaf7db681fba4c59c36876ae242626d173f1734a9acd6bdfa4175925861c86 WatchSource:0}: Error finding container aebaf7db681fba4c59c36876ae242626d173f1734a9acd6bdfa4175925861c86: Status 404 returned error can't find the container with id aebaf7db681fba4c59c36876ae242626d173f1734a9acd6bdfa4175925861c86 Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.474771 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-etc-swift\") pod \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.474945 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-ring-data-devices\") pod \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.475040 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-scripts\") pod \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.475139 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-dispersionconf\") pod \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.481672 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-combined-ca-bundle\") pod \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.481749 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh78t\" (UniqueName: \"kubernetes.io/projected/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-kube-api-access-qh78t\") pod \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.482535 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" (UID: "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.482988 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" (UID: "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.483053 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-swiftconf\") pod \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\" (UID: \"ca2a461b-b7e7-4c74-9da8-2df85b95e6fd\") " Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.483620 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-scripts" (OuterVolumeSpecName: "scripts") pod "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" (UID: "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.484737 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.484770 4733 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.484783 4733 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.488870 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" (UID: "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.492881 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" (UID: "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.505893 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" (UID: "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.510753 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-kube-api-access-qh78t" (OuterVolumeSpecName: "kube-api-access-qh78t") pod "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" (UID: "ca2a461b-b7e7-4c74-9da8-2df85b95e6fd"). InnerVolumeSpecName "kube-api-access-qh78t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.590333 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh78t\" (UniqueName: \"kubernetes.io/projected/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-kube-api-access-qh78t\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.595582 4733 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.595703 4733 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.595777 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:43 crc kubenswrapper[4733]: W0318 10:31:43.789355 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e3fc960_7783_4952_90c9_1551c780ae03.slice/crio-9c5077ef854e18aeb0823678f18007750be98c6ebb218cfdc6e156afe1f3ff45 WatchSource:0}: Error finding container 9c5077ef854e18aeb0823678f18007750be98c6ebb218cfdc6e156afe1f3ff45: Status 404 returned error can't find the container with id 9c5077ef854e18aeb0823678f18007750be98c6ebb218cfdc6e156afe1f3ff45 Mar 18 10:31:43 crc kubenswrapper[4733]: I0318 10:31:43.789683 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nfmp2"] Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.297392 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nfmp2" 
event={"ID":"5e3fc960-7783-4952-90c9-1551c780ae03","Type":"ContainerStarted","Data":"9c5077ef854e18aeb0823678f18007750be98c6ebb218cfdc6e156afe1f3ff45"} Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.299910 4733 generic.go:334] "Generic (PLEG): container finished" podID="53658e2a-4376-49b7-82eb-f46c3dee3b6a" containerID="9bc7f39c4918c4a53f61ec2045418343aab6acedb5d7104271be60607764a8a9" exitCode=0 Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.299982 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dj64z" event={"ID":"53658e2a-4376-49b7-82eb-f46c3dee3b6a","Type":"ContainerDied","Data":"9bc7f39c4918c4a53f61ec2045418343aab6acedb5d7104271be60607764a8a9"} Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.300015 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dj64z" event={"ID":"53658e2a-4376-49b7-82eb-f46c3dee3b6a","Type":"ContainerStarted","Data":"aebaf7db681fba4c59c36876ae242626d173f1734a9acd6bdfa4175925861c86"} Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.300036 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5ngrz" Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.369177 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5ngrz"] Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.375519 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-5ngrz"] Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.677704 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5795-account-create-update-nkww7" Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.784887 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-fvlqt" Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.815649 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfscv\" (UniqueName: \"kubernetes.io/projected/ff5315db-fb68-4558-85c1-cf538d0e2770-kube-api-access-jfscv\") pod \"ff5315db-fb68-4558-85c1-cf538d0e2770\" (UID: \"ff5315db-fb68-4558-85c1-cf538d0e2770\") " Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.815700 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5315db-fb68-4558-85c1-cf538d0e2770-operator-scripts\") pod \"ff5315db-fb68-4558-85c1-cf538d0e2770\" (UID: \"ff5315db-fb68-4558-85c1-cf538d0e2770\") " Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.816557 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff5315db-fb68-4558-85c1-cf538d0e2770-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff5315db-fb68-4558-85c1-cf538d0e2770" (UID: "ff5315db-fb68-4558-85c1-cf538d0e2770"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.821489 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5315db-fb68-4558-85c1-cf538d0e2770-kube-api-access-jfscv" (OuterVolumeSpecName: "kube-api-access-jfscv") pod "ff5315db-fb68-4558-85c1-cf538d0e2770" (UID: "ff5315db-fb68-4558-85c1-cf538d0e2770"). InnerVolumeSpecName "kube-api-access-jfscv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.918078 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d4401f-2343-41fa-82ae-877674337bf4-operator-scripts\") pod \"84d4401f-2343-41fa-82ae-877674337bf4\" (UID: \"84d4401f-2343-41fa-82ae-877674337bf4\") " Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.918151 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spqc4\" (UniqueName: \"kubernetes.io/projected/84d4401f-2343-41fa-82ae-877674337bf4-kube-api-access-spqc4\") pod \"84d4401f-2343-41fa-82ae-877674337bf4\" (UID: \"84d4401f-2343-41fa-82ae-877674337bf4\") " Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.918697 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfscv\" (UniqueName: \"kubernetes.io/projected/ff5315db-fb68-4558-85c1-cf538d0e2770-kube-api-access-jfscv\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.918722 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5315db-fb68-4558-85c1-cf538d0e2770-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.919842 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d4401f-2343-41fa-82ae-877674337bf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84d4401f-2343-41fa-82ae-877674337bf4" (UID: "84d4401f-2343-41fa-82ae-877674337bf4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:44 crc kubenswrapper[4733]: I0318 10:31:44.922924 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d4401f-2343-41fa-82ae-877674337bf4-kube-api-access-spqc4" (OuterVolumeSpecName: "kube-api-access-spqc4") pod "84d4401f-2343-41fa-82ae-877674337bf4" (UID: "84d4401f-2343-41fa-82ae-877674337bf4"). InnerVolumeSpecName "kube-api-access-spqc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.020443 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spqc4\" (UniqueName: \"kubernetes.io/projected/84d4401f-2343-41fa-82ae-877674337bf4-kube-api-access-spqc4\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.020474 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d4401f-2343-41fa-82ae-877674337bf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.189382 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2a461b-b7e7-4c74-9da8-2df85b95e6fd" path="/var/lib/kubelet/pods/ca2a461b-b7e7-4c74-9da8-2df85b95e6fd/volumes" Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.309566 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-fvlqt" Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.309425 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fvlqt" event={"ID":"84d4401f-2343-41fa-82ae-877674337bf4","Type":"ContainerDied","Data":"b8b55af5d60c9bfa951021421dc142b3018bef82cf543260041378c3f0e7cbde"} Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.310338 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8b55af5d60c9bfa951021421dc142b3018bef82cf543260041378c3f0e7cbde" Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.317396 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5795-account-create-update-nkww7" Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.317391 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5795-account-create-update-nkww7" event={"ID":"ff5315db-fb68-4558-85c1-cf538d0e2770","Type":"ContainerDied","Data":"2fb7df9dc8f479fa452223731ed81c51e757e022bc1a6f323d9c6bcfdc951d47"} Mar 18 10:31:45 crc kubenswrapper[4733]: I0318 10:31:45.317443 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fb7df9dc8f479fa452223731ed81c51e757e022bc1a6f323d9c6bcfdc951d47" Mar 18 10:31:46 crc kubenswrapper[4733]: I0318 10:31:46.947890 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0" Mar 18 10:31:46 crc kubenswrapper[4733]: E0318 10:31:46.948489 4733 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 10:31:46 crc kubenswrapper[4733]: E0318 10:31:46.948510 4733 projected.go:194] Error preparing data for projected volume etc-swift 
for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 10:31:46 crc kubenswrapper[4733]: E0318 10:31:46.948558 4733 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift podName:4f94cfc9-67cc-474c-8d99-58a9d4e0273f nodeName:}" failed. No retries permitted until 2026-03-18 10:31:54.948542033 +0000 UTC m=+1154.440276368 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift") pod "swift-storage-0" (UID: "4f94cfc9-67cc-474c-8d99-58a9d4e0273f") : configmap "swift-ring-files" not found Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.334584 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dj64z" event={"ID":"53658e2a-4376-49b7-82eb-f46c3dee3b6a","Type":"ContainerDied","Data":"aebaf7db681fba4c59c36876ae242626d173f1734a9acd6bdfa4175925861c86"} Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.334628 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aebaf7db681fba4c59c36876ae242626d173f1734a9acd6bdfa4175925861c86" Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.399139 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.558836 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53658e2a-4376-49b7-82eb-f46c3dee3b6a-operator-scripts\") pod \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\" (UID: \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\") " Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.559092 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x78zp\" (UniqueName: \"kubernetes.io/projected/53658e2a-4376-49b7-82eb-f46c3dee3b6a-kube-api-access-x78zp\") pod \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\" (UID: \"53658e2a-4376-49b7-82eb-f46c3dee3b6a\") " Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.561487 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53658e2a-4376-49b7-82eb-f46c3dee3b6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53658e2a-4376-49b7-82eb-f46c3dee3b6a" (UID: "53658e2a-4376-49b7-82eb-f46c3dee3b6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.563048 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53658e2a-4376-49b7-82eb-f46c3dee3b6a-kube-api-access-x78zp" (OuterVolumeSpecName: "kube-api-access-x78zp") pod "53658e2a-4376-49b7-82eb-f46c3dee3b6a" (UID: "53658e2a-4376-49b7-82eb-f46c3dee3b6a"). InnerVolumeSpecName "kube-api-access-x78zp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.636749 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.661148 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53658e2a-4376-49b7-82eb-f46c3dee3b6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:47 crc kubenswrapper[4733]: I0318 10:31:47.661413 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x78zp\" (UniqueName: \"kubernetes.io/projected/53658e2a-4376-49b7-82eb-f46c3dee3b6a-kube-api-access-x78zp\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:48 crc kubenswrapper[4733]: I0318 10:31:48.207343 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" Mar 18 10:31:48 crc kubenswrapper[4733]: I0318 10:31:48.285931 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rrvg6"] Mar 18 10:31:48 crc kubenswrapper[4733]: I0318 10:31:48.286276 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-rrvg6" podUID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" containerName="dnsmasq-dns" containerID="cri-o://e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b" gracePeriod=10 Mar 18 10:31:48 crc kubenswrapper[4733]: I0318 10:31:48.344965 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dj64z" Mar 18 10:31:48 crc kubenswrapper[4733]: I0318 10:31:48.344957 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nfmp2" event={"ID":"5e3fc960-7783-4952-90c9-1551c780ae03","Type":"ContainerStarted","Data":"dfd0abb25a1e6ce4147875c303d9c9787b741334508ecf0f7ab8f557701355ad"} Mar 18 10:31:48 crc kubenswrapper[4733]: I0318 10:31:48.370646 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nfmp2" podStartSLOduration=2.888059724 podStartE2EDuration="6.370627827s" podCreationTimestamp="2026-03-18 10:31:42 +0000 UTC" firstStartedPulling="2026-03-18 10:31:43.796831733 +0000 UTC m=+1143.288566068" lastFinishedPulling="2026-03-18 10:31:47.279399846 +0000 UTC m=+1146.771134171" observedRunningTime="2026-03-18 10:31:48.362744623 +0000 UTC m=+1147.854478948" watchObservedRunningTime="2026-03-18 10:31:48.370627827 +0000 UTC m=+1147.862362152" Mar 18 10:31:48 crc kubenswrapper[4733]: I0318 10:31:48.938956 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dj64z"] Mar 18 10:31:48 crc kubenswrapper[4733]: I0318 10:31:48.952883 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dj64z"] Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.199104 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53658e2a-4376-49b7-82eb-f46c3dee3b6a" path="/var/lib/kubelet/pods/53658e2a-4376-49b7-82eb-f46c3dee3b6a/volumes" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.247165 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.354891 4733 generic.go:334] "Generic (PLEG): container finished" podID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" containerID="e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b" exitCode=0 Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.354942 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-rrvg6" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.354956 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rrvg6" event={"ID":"15df79ef-9d7a-4310-ba27-bdf8cb200f0f","Type":"ContainerDied","Data":"e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b"} Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.355000 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-rrvg6" event={"ID":"15df79ef-9d7a-4310-ba27-bdf8cb200f0f","Type":"ContainerDied","Data":"1ad8ff69c6adc3dabc943b2d0fb235bc6e4c5c162e015c87081a85eb5257721c"} Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.355023 4733 scope.go:117] "RemoveContainer" containerID="e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.402689 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-nb\") pod \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.402785 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-sb\") pod \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\" (UID: 
\"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.402838 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-dns-svc\") pod \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.402881 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5w85\" (UniqueName: \"kubernetes.io/projected/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-kube-api-access-g5w85\") pod \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.402899 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-config\") pod \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\" (UID: \"15df79ef-9d7a-4310-ba27-bdf8cb200f0f\") " Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.425741 4733 scope.go:117] "RemoveContainer" containerID="e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.437375 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-kube-api-access-g5w85" (OuterVolumeSpecName: "kube-api-access-g5w85") pod "15df79ef-9d7a-4310-ba27-bdf8cb200f0f" (UID: "15df79ef-9d7a-4310-ba27-bdf8cb200f0f"). InnerVolumeSpecName "kube-api-access-g5w85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.448976 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "15df79ef-9d7a-4310-ba27-bdf8cb200f0f" (UID: "15df79ef-9d7a-4310-ba27-bdf8cb200f0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.462548 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "15df79ef-9d7a-4310-ba27-bdf8cb200f0f" (UID: "15df79ef-9d7a-4310-ba27-bdf8cb200f0f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.479509 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-config" (OuterVolumeSpecName: "config") pod "15df79ef-9d7a-4310-ba27-bdf8cb200f0f" (UID: "15df79ef-9d7a-4310-ba27-bdf8cb200f0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.492844 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15df79ef-9d7a-4310-ba27-bdf8cb200f0f" (UID: "15df79ef-9d7a-4310-ba27-bdf8cb200f0f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.504487 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5w85\" (UniqueName: \"kubernetes.io/projected/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-kube-api-access-g5w85\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.504927 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.504942 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.504952 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.504960 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15df79ef-9d7a-4310-ba27-bdf8cb200f0f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.541179 4733 scope.go:117] "RemoveContainer" containerID="e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b" Mar 18 10:31:49 crc kubenswrapper[4733]: E0318 10:31:49.541593 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b\": container with ID starting with e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b not found: ID does not exist" 
containerID="e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.541622 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b"} err="failed to get container status \"e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b\": rpc error: code = NotFound desc = could not find container \"e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b\": container with ID starting with e9b6a3c12243c23c29491c07886c30384e8dc7b44b11048b3a29f67cf6a0e54b not found: ID does not exist" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.541644 4733 scope.go:117] "RemoveContainer" containerID="e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937" Mar 18 10:31:49 crc kubenswrapper[4733]: E0318 10:31:49.541935 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937\": container with ID starting with e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937 not found: ID does not exist" containerID="e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.542059 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937"} err="failed to get container status \"e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937\": rpc error: code = NotFound desc = could not find container \"e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937\": container with ID starting with e32e3455bac4748d83432ff47c120fde28d910e682c598fcca6672f025864937 not found: ID does not exist" Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.686330 4733 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rrvg6"] Mar 18 10:31:49 crc kubenswrapper[4733]: I0318 10:31:49.692059 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-rrvg6"] Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.867572 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ptbmt"] Mar 18 10:31:50 crc kubenswrapper[4733]: E0318 10:31:50.868004 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d4401f-2343-41fa-82ae-877674337bf4" containerName="mariadb-database-create" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868022 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d4401f-2343-41fa-82ae-877674337bf4" containerName="mariadb-database-create" Mar 18 10:31:50 crc kubenswrapper[4733]: E0318 10:31:50.868030 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53658e2a-4376-49b7-82eb-f46c3dee3b6a" containerName="mariadb-account-create-update" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868038 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="53658e2a-4376-49b7-82eb-f46c3dee3b6a" containerName="mariadb-account-create-update" Mar 18 10:31:50 crc kubenswrapper[4733]: E0318 10:31:50.868063 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" containerName="dnsmasq-dns" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868069 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" containerName="dnsmasq-dns" Mar 18 10:31:50 crc kubenswrapper[4733]: E0318 10:31:50.868080 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5315db-fb68-4558-85c1-cf538d0e2770" containerName="mariadb-account-create-update" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868086 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ff5315db-fb68-4558-85c1-cf538d0e2770" containerName="mariadb-account-create-update" Mar 18 10:31:50 crc kubenswrapper[4733]: E0318 10:31:50.868098 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" containerName="init" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868104 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" containerName="init" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868297 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="53658e2a-4376-49b7-82eb-f46c3dee3b6a" containerName="mariadb-account-create-update" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868316 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d4401f-2343-41fa-82ae-877674337bf4" containerName="mariadb-database-create" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868329 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5315db-fb68-4558-85c1-cf538d0e2770" containerName="mariadb-account-create-update" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868338 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" containerName="dnsmasq-dns" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.868895 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.871430 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7vljt" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.872006 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 10:31:50 crc kubenswrapper[4733]: I0318 10:31:50.886275 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ptbmt"] Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.029736 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-db-sync-config-data\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.030199 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmrf\" (UniqueName: \"kubernetes.io/projected/63c8f7bc-4162-4903-b3f9-96c8736a27b8-kube-api-access-vmmrf\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.030295 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-config-data\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.030389 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-combined-ca-bundle\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.133090 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-combined-ca-bundle\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.133586 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-db-sync-config-data\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.133814 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmrf\" (UniqueName: \"kubernetes.io/projected/63c8f7bc-4162-4903-b3f9-96c8736a27b8-kube-api-access-vmmrf\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.133890 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-config-data\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.139769 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-db-sync-config-data\") pod \"glance-db-sync-ptbmt\" (UID: 
\"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.139906 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-combined-ca-bundle\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.151323 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-config-data\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.155957 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmrf\" (UniqueName: \"kubernetes.io/projected/63c8f7bc-4162-4903-b3f9-96c8736a27b8-kube-api-access-vmmrf\") pod \"glance-db-sync-ptbmt\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.183648 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ptbmt" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.190980 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15df79ef-9d7a-4310-ba27-bdf8cb200f0f" path="/var/lib/kubelet/pods/15df79ef-9d7a-4310-ba27-bdf8cb200f0f/volumes" Mar 18 10:31:51 crc kubenswrapper[4733]: I0318 10:31:51.773520 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ptbmt"] Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.387443 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ptbmt" event={"ID":"63c8f7bc-4162-4903-b3f9-96c8736a27b8","Type":"ContainerStarted","Data":"0e53d30c9ba54d0bab85e4e6730952c64da04afa3c77dd75422e1f34b5188d78"} Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.574865 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gflvw"] Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.575879 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.579466 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.589958 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gflvw"] Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.670786 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4773e90c-6c0c-411c-810b-844d8570d4db-operator-scripts\") pod \"root-account-create-update-gflvw\" (UID: \"4773e90c-6c0c-411c-810b-844d8570d4db\") " pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.670876 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gqp6\" (UniqueName: \"kubernetes.io/projected/4773e90c-6c0c-411c-810b-844d8570d4db-kube-api-access-2gqp6\") pod \"root-account-create-update-gflvw\" (UID: \"4773e90c-6c0c-411c-810b-844d8570d4db\") " pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.772818 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gqp6\" (UniqueName: \"kubernetes.io/projected/4773e90c-6c0c-411c-810b-844d8570d4db-kube-api-access-2gqp6\") pod \"root-account-create-update-gflvw\" (UID: \"4773e90c-6c0c-411c-810b-844d8570d4db\") " pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.774453 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4773e90c-6c0c-411c-810b-844d8570d4db-operator-scripts\") pod \"root-account-create-update-gflvw\" (UID: 
\"4773e90c-6c0c-411c-810b-844d8570d4db\") " pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.773519 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4773e90c-6c0c-411c-810b-844d8570d4db-operator-scripts\") pod \"root-account-create-update-gflvw\" (UID: \"4773e90c-6c0c-411c-810b-844d8570d4db\") " pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.794701 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gqp6\" (UniqueName: \"kubernetes.io/projected/4773e90c-6c0c-411c-810b-844d8570d4db-kube-api-access-2gqp6\") pod \"root-account-create-update-gflvw\" (UID: \"4773e90c-6c0c-411c-810b-844d8570d4db\") " pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:52 crc kubenswrapper[4733]: I0318 10:31:52.907668 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:53 crc kubenswrapper[4733]: I0318 10:31:53.401980 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gflvw"] Mar 18 10:31:53 crc kubenswrapper[4733]: W0318 10:31:53.421983 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4773e90c_6c0c_411c_810b_844d8570d4db.slice/crio-236e88250136b9b1d8c3c58d2a1f9c316bbac83718dd8135763f8a4a32d31a30 WatchSource:0}: Error finding container 236e88250136b9b1d8c3c58d2a1f9c316bbac83718dd8135763f8a4a32d31a30: Status 404 returned error can't find the container with id 236e88250136b9b1d8c3c58d2a1f9c316bbac83718dd8135763f8a4a32d31a30 Mar 18 10:31:54 crc kubenswrapper[4733]: I0318 10:31:54.428661 4733 generic.go:334] "Generic (PLEG): container finished" podID="5e3fc960-7783-4952-90c9-1551c780ae03" containerID="dfd0abb25a1e6ce4147875c303d9c9787b741334508ecf0f7ab8f557701355ad" exitCode=0 Mar 18 10:31:54 crc kubenswrapper[4733]: I0318 10:31:54.428957 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nfmp2" event={"ID":"5e3fc960-7783-4952-90c9-1551c780ae03","Type":"ContainerDied","Data":"dfd0abb25a1e6ce4147875c303d9c9787b741334508ecf0f7ab8f557701355ad"} Mar 18 10:31:54 crc kubenswrapper[4733]: I0318 10:31:54.434320 4733 generic.go:334] "Generic (PLEG): container finished" podID="4773e90c-6c0c-411c-810b-844d8570d4db" containerID="fc8a98034f827fb8988cc2fa281e7a7c5e2bd32e772267e324591ed784c75b62" exitCode=0 Mar 18 10:31:54 crc kubenswrapper[4733]: I0318 10:31:54.434425 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gflvw" event={"ID":"4773e90c-6c0c-411c-810b-844d8570d4db","Type":"ContainerDied","Data":"fc8a98034f827fb8988cc2fa281e7a7c5e2bd32e772267e324591ed784c75b62"} Mar 18 10:31:54 crc kubenswrapper[4733]: I0318 10:31:54.434463 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gflvw" event={"ID":"4773e90c-6c0c-411c-810b-844d8570d4db","Type":"ContainerStarted","Data":"236e88250136b9b1d8c3c58d2a1f9c316bbac83718dd8135763f8a4a32d31a30"} Mar 18 10:31:54 crc kubenswrapper[4733]: I0318 10:31:54.436522 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="0fb5e774f72bc7530e7861681639d72697b8c0245883531528195b98bc45ea93" exitCode=0 Mar 18 10:31:54 crc kubenswrapper[4733]: I0318 10:31:54.436561 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"0fb5e774f72bc7530e7861681639d72697b8c0245883531528195b98bc45ea93"} Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.032979 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.053882 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4f94cfc9-67cc-474c-8d99-58a9d4e0273f-etc-swift\") pod \"swift-storage-0\" (UID: \"4f94cfc9-67cc-474c-8d99-58a9d4e0273f\") " pod="openstack/swift-storage-0" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.300531 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.445299 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="1b521608cd076add0dc6ea82ec6fd5b69318ec8068de497c0a6615c97830553d" exitCode=0 Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.445381 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"1b521608cd076add0dc6ea82ec6fd5b69318ec8068de497c0a6615c97830553d"} Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.455568 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"001b9e894ecd9bef6f800f761255da30231b8fbfeb22f304bc145622a4998afa"} Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.456583 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.511467 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.509099846 podStartE2EDuration="1m4.511449354s" podCreationTimestamp="2026-03-18 10:30:51 +0000 UTC" firstStartedPulling="2026-03-18 10:31:05.062238827 +0000 UTC m=+1104.553973152" lastFinishedPulling="2026-03-18 10:31:21.064588335 +0000 UTC m=+1120.556322660" observedRunningTime="2026-03-18 10:31:55.509646563 +0000 UTC m=+1155.001380898" watchObservedRunningTime="2026-03-18 10:31:55.511449354 +0000 UTC m=+1155.003183679" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.881244 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rh64b" podUID="e3c842d3-b3dd-4cf2-9df0-16cea4061bc5" containerName="ovn-controller" probeResult="failure" output=< Mar 18 
10:31:55 crc kubenswrapper[4733]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 10:31:55 crc kubenswrapper[4733]: > Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.888958 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.898556 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.899588 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ljrgt" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.908218 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ljrgt" Mar 18 10:31:55 crc kubenswrapper[4733]: I0318 10:31:55.974652 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.051209 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-dispersionconf\") pod \"5e3fc960-7783-4952-90c9-1551c780ae03\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.051281 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e3fc960-7783-4952-90c9-1551c780ae03-etc-swift\") pod \"5e3fc960-7783-4952-90c9-1551c780ae03\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.051322 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-ring-data-devices\") pod \"5e3fc960-7783-4952-90c9-1551c780ae03\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.051406 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-combined-ca-bundle\") pod \"5e3fc960-7783-4952-90c9-1551c780ae03\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.051453 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-swiftconf\") pod \"5e3fc960-7783-4952-90c9-1551c780ae03\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.051491 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-scripts\") pod \"5e3fc960-7783-4952-90c9-1551c780ae03\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.051597 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gqp6\" (UniqueName: \"kubernetes.io/projected/4773e90c-6c0c-411c-810b-844d8570d4db-kube-api-access-2gqp6\") pod \"4773e90c-6c0c-411c-810b-844d8570d4db\" (UID: \"4773e90c-6c0c-411c-810b-844d8570d4db\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.051625 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4773e90c-6c0c-411c-810b-844d8570d4db-operator-scripts\") pod \"4773e90c-6c0c-411c-810b-844d8570d4db\" (UID: \"4773e90c-6c0c-411c-810b-844d8570d4db\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 
10:31:56.051670 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2xzc\" (UniqueName: \"kubernetes.io/projected/5e3fc960-7783-4952-90c9-1551c780ae03-kube-api-access-c2xzc\") pod \"5e3fc960-7783-4952-90c9-1551c780ae03\" (UID: \"5e3fc960-7783-4952-90c9-1551c780ae03\") " Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.052941 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4773e90c-6c0c-411c-810b-844d8570d4db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4773e90c-6c0c-411c-810b-844d8570d4db" (UID: "4773e90c-6c0c-411c-810b-844d8570d4db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.053902 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e3fc960-7783-4952-90c9-1551c780ae03-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5e3fc960-7783-4952-90c9-1551c780ae03" (UID: "5e3fc960-7783-4952-90c9-1551c780ae03"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.055465 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5e3fc960-7783-4952-90c9-1551c780ae03" (UID: "5e3fc960-7783-4952-90c9-1551c780ae03"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.056820 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3fc960-7783-4952-90c9-1551c780ae03-kube-api-access-c2xzc" (OuterVolumeSpecName: "kube-api-access-c2xzc") pod "5e3fc960-7783-4952-90c9-1551c780ae03" (UID: "5e3fc960-7783-4952-90c9-1551c780ae03"). InnerVolumeSpecName "kube-api-access-c2xzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.058770 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5e3fc960-7783-4952-90c9-1551c780ae03" (UID: "5e3fc960-7783-4952-90c9-1551c780ae03"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.059171 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4773e90c-6c0c-411c-810b-844d8570d4db-kube-api-access-2gqp6" (OuterVolumeSpecName: "kube-api-access-2gqp6") pod "4773e90c-6c0c-411c-810b-844d8570d4db" (UID: "4773e90c-6c0c-411c-810b-844d8570d4db"). InnerVolumeSpecName "kube-api-access-2gqp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.075562 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-scripts" (OuterVolumeSpecName: "scripts") pod "5e3fc960-7783-4952-90c9-1551c780ae03" (UID: "5e3fc960-7783-4952-90c9-1551c780ae03"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.076945 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e3fc960-7783-4952-90c9-1551c780ae03" (UID: "5e3fc960-7783-4952-90c9-1551c780ae03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.080397 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5e3fc960-7783-4952-90c9-1551c780ae03" (UID: "5e3fc960-7783-4952-90c9-1551c780ae03"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154173 4733 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154229 4733 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e3fc960-7783-4952-90c9-1551c780ae03-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154240 4733 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154252 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 
18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154264 4733 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e3fc960-7783-4952-90c9-1551c780ae03-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154276 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e3fc960-7783-4952-90c9-1551c780ae03-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154286 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gqp6\" (UniqueName: \"kubernetes.io/projected/4773e90c-6c0c-411c-810b-844d8570d4db-kube-api-access-2gqp6\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154301 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4773e90c-6c0c-411c-810b-844d8570d4db-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.154312 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2xzc\" (UniqueName: \"kubernetes.io/projected/5e3fc960-7783-4952-90c9-1551c780ae03-kube-api-access-c2xzc\") on node \"crc\" DevicePath \"\"" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.169965 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rh64b-config-kckbs"] Mar 18 10:31:56 crc kubenswrapper[4733]: E0318 10:31:56.171471 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4773e90c-6c0c-411c-810b-844d8570d4db" containerName="mariadb-account-create-update" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.171500 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="4773e90c-6c0c-411c-810b-844d8570d4db" containerName="mariadb-account-create-update" Mar 18 10:31:56 crc kubenswrapper[4733]: E0318 10:31:56.171535 4733 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3fc960-7783-4952-90c9-1551c780ae03" containerName="swift-ring-rebalance" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.171545 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3fc960-7783-4952-90c9-1551c780ae03" containerName="swift-ring-rebalance" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.172723 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="4773e90c-6c0c-411c-810b-844d8570d4db" containerName="mariadb-account-create-update" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.172759 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e3fc960-7783-4952-90c9-1551c780ae03" containerName="swift-ring-rebalance" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.174638 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.188507 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.209032 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rh64b-config-kckbs"] Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.357033 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-log-ovn\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.357095 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run-ovn\") pod 
\"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.357133 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-additional-scripts\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.357397 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.357536 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-scripts\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.357687 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlx9j\" (UniqueName: \"kubernetes.io/projected/0e628bba-84e9-4e1c-9e42-72277667b0a2-kube-api-access-mlx9j\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465549 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlx9j\" (UniqueName: 
\"kubernetes.io/projected/0e628bba-84e9-4e1c-9e42-72277667b0a2-kube-api-access-mlx9j\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465605 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-log-ovn\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465651 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run-ovn\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465680 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-additional-scripts\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465731 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465766 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-scripts\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465910 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-log-ovn\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465968 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run-ovn\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.465998 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.466717 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-additional-scripts\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.467972 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-scripts\") 
pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.468707 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nfmp2" event={"ID":"5e3fc960-7783-4952-90c9-1551c780ae03","Type":"ContainerDied","Data":"9c5077ef854e18aeb0823678f18007750be98c6ebb218cfdc6e156afe1f3ff45"} Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.468736 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c5077ef854e18aeb0823678f18007750be98c6ebb218cfdc6e156afe1f3ff45" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.468776 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nfmp2" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.470346 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gflvw" event={"ID":"4773e90c-6c0c-411c-810b-844d8570d4db","Type":"ContainerDied","Data":"236e88250136b9b1d8c3c58d2a1f9c316bbac83718dd8135763f8a4a32d31a30"} Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.470372 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="236e88250136b9b1d8c3c58d2a1f9c316bbac83718dd8135763f8a4a32d31a30" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.470383 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gflvw" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.471731 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"1082901e937aee3ff012135f915e36be2d201e2adb97cf8ab3cb6b5dbb1e9f6f"} Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.471886 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.473439 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"4b22df5788ca868e21d692e0286a4fd86e269619d6c47e62147fd029667867ea"} Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.489549 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlx9j\" (UniqueName: \"kubernetes.io/projected/0e628bba-84e9-4e1c-9e42-72277667b0a2-kube-api-access-mlx9j\") pod \"ovn-controller-rh64b-config-kckbs\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.499341 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:31:56 crc kubenswrapper[4733]: I0318 10:31:56.974848 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=57.428693781 podStartE2EDuration="1m6.974826055s" podCreationTimestamp="2026-03-18 10:30:50 +0000 UTC" firstStartedPulling="2026-03-18 10:31:05.052527152 +0000 UTC m=+1104.544261487" lastFinishedPulling="2026-03-18 10:31:14.598659436 +0000 UTC m=+1114.090393761" observedRunningTime="2026-03-18 10:31:56.500209345 +0000 UTC m=+1155.991943680" watchObservedRunningTime="2026-03-18 10:31:56.974826055 +0000 UTC m=+1156.466560380" Mar 18 10:31:57 crc kubenswrapper[4733]: I0318 10:31:57.022389 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rh64b-config-kckbs"] Mar 18 10:31:57 crc kubenswrapper[4733]: W0318 10:31:57.165528 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e628bba_84e9_4e1c_9e42_72277667b0a2.slice/crio-22ce6d62b2c247efc2f78a573eaad62efb486d3285646f6c9d81f05e03bd4390 WatchSource:0}: Error finding container 22ce6d62b2c247efc2f78a573eaad62efb486d3285646f6c9d81f05e03bd4390: Status 404 returned error can't find the container with id 22ce6d62b2c247efc2f78a573eaad62efb486d3285646f6c9d81f05e03bd4390 Mar 18 10:31:57 crc kubenswrapper[4733]: I0318 10:31:57.484352 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rh64b-config-kckbs" event={"ID":"0e628bba-84e9-4e1c-9e42-72277667b0a2","Type":"ContainerStarted","Data":"22ce6d62b2c247efc2f78a573eaad62efb486d3285646f6c9d81f05e03bd4390"} Mar 18 10:31:58 crc kubenswrapper[4733]: I0318 10:31:58.496832 4733 generic.go:334] "Generic (PLEG): container finished" podID="0e628bba-84e9-4e1c-9e42-72277667b0a2" containerID="6a586c5fd4b77aeae152ada8c17b0c5946fd162e927825804e11d81559ba17f0" exitCode=0 Mar 18 10:31:58 crc 
kubenswrapper[4733]: I0318 10:31:58.496892 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rh64b-config-kckbs" event={"ID":"0e628bba-84e9-4e1c-9e42-72277667b0a2","Type":"ContainerDied","Data":"6a586c5fd4b77aeae152ada8c17b0c5946fd162e927825804e11d81559ba17f0"} Mar 18 10:31:58 crc kubenswrapper[4733]: I0318 10:31:58.503468 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"d19587f53ed713b0ff5fa1da4b4e7eba2645babc58913c28cba7a274f45c17df"} Mar 18 10:31:58 crc kubenswrapper[4733]: I0318 10:31:58.503824 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"b4be968df0168e894e0715f49feb099d6e403dc44a8d1eee90a40ddfac216669"} Mar 18 10:31:58 crc kubenswrapper[4733]: I0318 10:31:58.503840 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"dbfe977643a9bde3af88327de7777951f800a8fc1d72a36faddff72583073343"} Mar 18 10:31:58 crc kubenswrapper[4733]: I0318 10:31:58.503852 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"266fbf2493fb04519ff239ccf13b44e1a705933d786e0b3b1056deede6dccf8c"} Mar 18 10:31:58 crc kubenswrapper[4733]: I0318 10:31:58.948953 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gflvw"] Mar 18 10:31:58 crc kubenswrapper[4733]: I0318 10:31:58.955139 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gflvw"] Mar 18 10:31:59 crc kubenswrapper[4733]: I0318 10:31:59.184821 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4773e90c-6c0c-411c-810b-844d8570d4db" path="/var/lib/kubelet/pods/4773e90c-6c0c-411c-810b-844d8570d4db/volumes" Mar 18 10:31:59 crc kubenswrapper[4733]: I0318 10:31:59.516138 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="001b9e894ecd9bef6f800f761255da30231b8fbfeb22f304bc145622a4998afa" exitCode=0 Mar 18 10:31:59 crc kubenswrapper[4733]: I0318 10:31:59.516217 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"001b9e894ecd9bef6f800f761255da30231b8fbfeb22f304bc145622a4998afa"} Mar 18 10:31:59 crc kubenswrapper[4733]: I0318 10:31:59.517826 4733 scope.go:117] "RemoveContainer" containerID="001b9e894ecd9bef6f800f761255da30231b8fbfeb22f304bc145622a4998afa" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.133827 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563832-njktw"] Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.134993 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-njktw" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.141366 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-njktw"] Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.171281 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.171499 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.171696 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.232365 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n47d2\" (UniqueName: \"kubernetes.io/projected/ac26b4cb-ac0b-4b78-9c5e-60c6563b478e-kube-api-access-n47d2\") pod \"auto-csr-approver-29563832-njktw\" (UID: \"ac26b4cb-ac0b-4b78-9c5e-60c6563b478e\") " pod="openshift-infra/auto-csr-approver-29563832-njktw" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.334460 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n47d2\" (UniqueName: \"kubernetes.io/projected/ac26b4cb-ac0b-4b78-9c5e-60c6563b478e-kube-api-access-n47d2\") pod \"auto-csr-approver-29563832-njktw\" (UID: \"ac26b4cb-ac0b-4b78-9c5e-60c6563b478e\") " pod="openshift-infra/auto-csr-approver-29563832-njktw" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.375384 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n47d2\" (UniqueName: \"kubernetes.io/projected/ac26b4cb-ac0b-4b78-9c5e-60c6563b478e-kube-api-access-n47d2\") pod \"auto-csr-approver-29563832-njktw\" (UID: \"ac26b4cb-ac0b-4b78-9c5e-60c6563b478e\") " 
pod="openshift-infra/auto-csr-approver-29563832-njktw" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.499790 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-njktw" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.524497 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="1082901e937aee3ff012135f915e36be2d201e2adb97cf8ab3cb6b5dbb1e9f6f" exitCode=0 Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.524542 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"1082901e937aee3ff012135f915e36be2d201e2adb97cf8ab3cb6b5dbb1e9f6f"} Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.525174 4733 scope.go:117] "RemoveContainer" containerID="1082901e937aee3ff012135f915e36be2d201e2adb97cf8ab3cb6b5dbb1e9f6f" Mar 18 10:32:00 crc kubenswrapper[4733]: I0318 10:32:00.876721 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rh64b" Mar 18 10:32:03 crc kubenswrapper[4733]: I0318 10:32:03.963860 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xb87f"] Mar 18 10:32:03 crc kubenswrapper[4733]: I0318 10:32:03.965869 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:03 crc kubenswrapper[4733]: I0318 10:32:03.968002 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 10:32:04 crc kubenswrapper[4733]: I0318 10:32:03.972002 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xb87f"] Mar 18 10:32:04 crc kubenswrapper[4733]: I0318 10:32:04.096008 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d533f566-eded-44ca-b276-7e3d437f9fce-operator-scripts\") pod \"root-account-create-update-xb87f\" (UID: \"d533f566-eded-44ca-b276-7e3d437f9fce\") " pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:04 crc kubenswrapper[4733]: I0318 10:32:04.096120 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n87bq\" (UniqueName: \"kubernetes.io/projected/d533f566-eded-44ca-b276-7e3d437f9fce-kube-api-access-n87bq\") pod \"root-account-create-update-xb87f\" (UID: \"d533f566-eded-44ca-b276-7e3d437f9fce\") " pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:04 crc kubenswrapper[4733]: I0318 10:32:04.197097 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n87bq\" (UniqueName: \"kubernetes.io/projected/d533f566-eded-44ca-b276-7e3d437f9fce-kube-api-access-n87bq\") pod \"root-account-create-update-xb87f\" (UID: \"d533f566-eded-44ca-b276-7e3d437f9fce\") " pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:04 crc kubenswrapper[4733]: I0318 10:32:04.197421 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d533f566-eded-44ca-b276-7e3d437f9fce-operator-scripts\") pod \"root-account-create-update-xb87f\" (UID: 
\"d533f566-eded-44ca-b276-7e3d437f9fce\") " pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:04 crc kubenswrapper[4733]: I0318 10:32:04.198157 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d533f566-eded-44ca-b276-7e3d437f9fce-operator-scripts\") pod \"root-account-create-update-xb87f\" (UID: \"d533f566-eded-44ca-b276-7e3d437f9fce\") " pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:04 crc kubenswrapper[4733]: I0318 10:32:04.220029 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n87bq\" (UniqueName: \"kubernetes.io/projected/d533f566-eded-44ca-b276-7e3d437f9fce-kube-api-access-n87bq\") pod \"root-account-create-update-xb87f\" (UID: \"d533f566-eded-44ca-b276-7e3d437f9fce\") " pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:04 crc kubenswrapper[4733]: I0318 10:32:04.327351 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:08 crc kubenswrapper[4733]: E0318 10:32:08.423472 4733 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 18 10:32:08 crc kubenswrapper[4733]: E0318 10:32:08.424086 4733 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmmrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privil
eged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-ptbmt_openstack(63c8f7bc-4162-4903-b3f9-96c8736a27b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 10:32:08 crc kubenswrapper[4733]: E0318 10:32:08.425512 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-ptbmt" podUID="63c8f7bc-4162-4903-b3f9-96c8736a27b8" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.583137 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.613371 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rh64b-config-kckbs" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.613704 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rh64b-config-kckbs" event={"ID":"0e628bba-84e9-4e1c-9e42-72277667b0a2","Type":"ContainerDied","Data":"22ce6d62b2c247efc2f78a573eaad62efb486d3285646f6c9d81f05e03bd4390"} Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.613733 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22ce6d62b2c247efc2f78a573eaad62efb486d3285646f6c9d81f05e03bd4390" Mar 18 10:32:08 crc kubenswrapper[4733]: E0318 10:32:08.615055 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-ptbmt" podUID="63c8f7bc-4162-4903-b3f9-96c8736a27b8" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.674943 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-scripts\") pod \"0e628bba-84e9-4e1c-9e42-72277667b0a2\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.675041 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-log-ovn\") pod \"0e628bba-84e9-4e1c-9e42-72277667b0a2\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.675085 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlx9j\" (UniqueName: \"kubernetes.io/projected/0e628bba-84e9-4e1c-9e42-72277667b0a2-kube-api-access-mlx9j\") pod 
\"0e628bba-84e9-4e1c-9e42-72277667b0a2\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.675507 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run-ovn\") pod \"0e628bba-84e9-4e1c-9e42-72277667b0a2\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.675713 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-additional-scripts\") pod \"0e628bba-84e9-4e1c-9e42-72277667b0a2\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.675749 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run\") pod \"0e628bba-84e9-4e1c-9e42-72277667b0a2\" (UID: \"0e628bba-84e9-4e1c-9e42-72277667b0a2\") " Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.675782 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0e628bba-84e9-4e1c-9e42-72277667b0a2" (UID: "0e628bba-84e9-4e1c-9e42-72277667b0a2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.675832 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0e628bba-84e9-4e1c-9e42-72277667b0a2" (UID: "0e628bba-84e9-4e1c-9e42-72277667b0a2"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.676868 4733 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.677068 4733 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.676902 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-scripts" (OuterVolumeSpecName: "scripts") pod "0e628bba-84e9-4e1c-9e42-72277667b0a2" (UID: "0e628bba-84e9-4e1c-9e42-72277667b0a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.676928 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run" (OuterVolumeSpecName: "var-run") pod "0e628bba-84e9-4e1c-9e42-72277667b0a2" (UID: "0e628bba-84e9-4e1c-9e42-72277667b0a2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.676985 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0e628bba-84e9-4e1c-9e42-72277667b0a2" (UID: "0e628bba-84e9-4e1c-9e42-72277667b0a2"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.681322 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e628bba-84e9-4e1c-9e42-72277667b0a2-kube-api-access-mlx9j" (OuterVolumeSpecName: "kube-api-access-mlx9j") pod "0e628bba-84e9-4e1c-9e42-72277667b0a2" (UID: "0e628bba-84e9-4e1c-9e42-72277667b0a2"). InnerVolumeSpecName "kube-api-access-mlx9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.778653 4733 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e628bba-84e9-4e1c-9e42-72277667b0a2-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.778686 4733 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.778697 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlx9j\" (UniqueName: \"kubernetes.io/projected/0e628bba-84e9-4e1c-9e42-72277667b0a2-kube-api-access-mlx9j\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.778708 4733 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0e628bba-84e9-4e1c-9e42-72277667b0a2-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.944253 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-njktw"] Mar 18 10:32:08 crc kubenswrapper[4733]: W0318 10:32:08.947211 4733 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac26b4cb_ac0b_4b78_9c5e_60c6563b478e.slice/crio-8ac1b0f08ec5d849a8bed24cb2098e48f5ac74bb6bd23eef8fa08f76cffaa710 WatchSource:0}: Error finding container 8ac1b0f08ec5d849a8bed24cb2098e48f5ac74bb6bd23eef8fa08f76cffaa710: Status 404 returned error can't find the container with id 8ac1b0f08ec5d849a8bed24cb2098e48f5ac74bb6bd23eef8fa08f76cffaa710 Mar 18 10:32:08 crc kubenswrapper[4733]: I0318 10:32:08.994781 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xb87f"] Mar 18 10:32:09 crc kubenswrapper[4733]: W0318 10:32:09.162233 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd533f566_eded_44ca_b276_7e3d437f9fce.slice/crio-e79e61d37c63bf1173443cab053dcac88ea174ce8aa0f6e02aa470300d87201a WatchSource:0}: Error finding container e79e61d37c63bf1173443cab053dcac88ea174ce8aa0f6e02aa470300d87201a: Status 404 returned error can't find the container with id e79e61d37c63bf1173443cab053dcac88ea174ce8aa0f6e02aa470300d87201a Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.639752 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"9ba4505789b02a7aa27e40622e705a2188f60bebd64231f907e57dbee799f683"} Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.641299 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.649400 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"f53733af192d507492916cb8fdfb9e36a34a2d5b06777b3df798d1db42baebb1"} Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.649815 4733 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.665559 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"f09f54d8c78fa82ebedc2ac49c3d041989b31b6f8ff99c6e4662292723ec43a0"} Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.665610 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"9f55e786080c5ae38bb5df6db279e3cae26306d1b2cb1db2ee0263184a18fa92"} Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.669585 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-njktw" event={"ID":"ac26b4cb-ac0b-4b78-9c5e-60c6563b478e","Type":"ContainerStarted","Data":"8ac1b0f08ec5d849a8bed24cb2098e48f5ac74bb6bd23eef8fa08f76cffaa710"} Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.680620 4733 generic.go:334] "Generic (PLEG): container finished" podID="d533f566-eded-44ca-b276-7e3d437f9fce" containerID="7076d89bfeedd95679091846270edbde667d954c3e6fdb8ee00f499b50144915" exitCode=0 Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.680677 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xb87f" event={"ID":"d533f566-eded-44ca-b276-7e3d437f9fce","Type":"ContainerDied","Data":"7076d89bfeedd95679091846270edbde667d954c3e6fdb8ee00f499b50144915"} Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.680704 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xb87f" event={"ID":"d533f566-eded-44ca-b276-7e3d437f9fce","Type":"ContainerStarted","Data":"e79e61d37c63bf1173443cab053dcac88ea174ce8aa0f6e02aa470300d87201a"} Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.740283 4733 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ovn-controller-rh64b-config-kckbs"] Mar 18 10:32:09 crc kubenswrapper[4733]: I0318 10:32:09.748425 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rh64b-config-kckbs"] Mar 18 10:32:10 crc kubenswrapper[4733]: I0318 10:32:10.693845 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"03b367733a02dfdbd7555ecf7e95eae5a15fd9b63d6399da7d29fc384dd98c9e"} Mar 18 10:32:10 crc kubenswrapper[4733]: I0318 10:32:10.694128 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"1109244a1ec38349bb9c33394e754f6ca373727120e2016826fdcbc9e7725e44"} Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.186827 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e628bba-84e9-4e1c-9e42-72277667b0a2" path="/var/lib/kubelet/pods/0e628bba-84e9-4e1c-9e42-72277667b0a2/volumes" Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.275979 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.424911 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n87bq\" (UniqueName: \"kubernetes.io/projected/d533f566-eded-44ca-b276-7e3d437f9fce-kube-api-access-n87bq\") pod \"d533f566-eded-44ca-b276-7e3d437f9fce\" (UID: \"d533f566-eded-44ca-b276-7e3d437f9fce\") " Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.425450 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d533f566-eded-44ca-b276-7e3d437f9fce-operator-scripts\") pod \"d533f566-eded-44ca-b276-7e3d437f9fce\" (UID: \"d533f566-eded-44ca-b276-7e3d437f9fce\") " Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.426060 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d533f566-eded-44ca-b276-7e3d437f9fce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d533f566-eded-44ca-b276-7e3d437f9fce" (UID: "d533f566-eded-44ca-b276-7e3d437f9fce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.429418 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d533f566-eded-44ca-b276-7e3d437f9fce-kube-api-access-n87bq" (OuterVolumeSpecName: "kube-api-access-n87bq") pod "d533f566-eded-44ca-b276-7e3d437f9fce" (UID: "d533f566-eded-44ca-b276-7e3d437f9fce"). InnerVolumeSpecName "kube-api-access-n87bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.527581 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n87bq\" (UniqueName: \"kubernetes.io/projected/d533f566-eded-44ca-b276-7e3d437f9fce-kube-api-access-n87bq\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.527611 4733 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d533f566-eded-44ca-b276-7e3d437f9fce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.708111 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"44cc72cf61efaa659b87c1775077631473a06083fb71d0347c99e0d6c7c980e0"} Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.708153 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"b254459b4ffc3b00f73b9608b3610886dd56d5840666881fbaa05386bf9269fe"} Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.709822 4733 generic.go:334] "Generic (PLEG): container finished" podID="ac26b4cb-ac0b-4b78-9c5e-60c6563b478e" containerID="da48f5028812280b5314f3d818c71b3049bdb0d8b1d5755bc74f1fedad4676d7" exitCode=0 Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.709922 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-njktw" event={"ID":"ac26b4cb-ac0b-4b78-9c5e-60c6563b478e","Type":"ContainerDied","Data":"da48f5028812280b5314f3d818c71b3049bdb0d8b1d5755bc74f1fedad4676d7"} Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.710977 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xb87f" 
event={"ID":"d533f566-eded-44ca-b276-7e3d437f9fce","Type":"ContainerDied","Data":"e79e61d37c63bf1173443cab053dcac88ea174ce8aa0f6e02aa470300d87201a"} Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.711004 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e79e61d37c63bf1173443cab053dcac88ea174ce8aa0f6e02aa470300d87201a" Mar 18 10:32:11 crc kubenswrapper[4733]: I0318 10:32:11.711058 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xb87f" Mar 18 10:32:12 crc kubenswrapper[4733]: I0318 10:32:12.726559 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"d9de382521acac6cb3603510e0e270022fc866b500e77b3ff7d004243afaca92"} Mar 18 10:32:12 crc kubenswrapper[4733]: I0318 10:32:12.727883 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"100e25c73401fec565df6ba0895b1f5ea995dfe6b674db5b47d91ed3fc50711d"} Mar 18 10:32:12 crc kubenswrapper[4733]: I0318 10:32:12.728747 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"a147ff81dceb9facdc603abb3d83a7c1416cf9e85e710a7991b6a0626233c259"} Mar 18 10:32:12 crc kubenswrapper[4733]: I0318 10:32:12.728850 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"6e01f54b2250bac6d78a8f11659c5103ad761b5a72b5739349656250527ed43e"} Mar 18 10:32:12 crc kubenswrapper[4733]: I0318 10:32:12.728953 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"4f94cfc9-67cc-474c-8d99-58a9d4e0273f","Type":"ContainerStarted","Data":"95ba0cc3f48d5f96dab0f77a9064545ae6f43f2040f94300ac8d0f8925102f4f"} Mar 18 10:32:12 crc kubenswrapper[4733]: I0318 10:32:12.768435 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.578083254 podStartE2EDuration="34.768414202s" podCreationTimestamp="2026-03-18 10:31:38 +0000 UTC" firstStartedPulling="2026-03-18 10:31:56.000950926 +0000 UTC m=+1155.492685251" lastFinishedPulling="2026-03-18 10:32:11.191281874 +0000 UTC m=+1170.683016199" observedRunningTime="2026-03-18 10:32:12.758109242 +0000 UTC m=+1172.249843567" watchObservedRunningTime="2026-03-18 10:32:12.768414202 +0000 UTC m=+1172.260148527" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.035996 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-5vkkr"] Mar 18 10:32:13 crc kubenswrapper[4733]: E0318 10:32:13.036626 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e628bba-84e9-4e1c-9e42-72277667b0a2" containerName="ovn-config" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.036642 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e628bba-84e9-4e1c-9e42-72277667b0a2" containerName="ovn-config" Mar 18 10:32:13 crc kubenswrapper[4733]: E0318 10:32:13.036653 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d533f566-eded-44ca-b276-7e3d437f9fce" containerName="mariadb-account-create-update" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.036660 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d533f566-eded-44ca-b276-7e3d437f9fce" containerName="mariadb-account-create-update" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.036815 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e628bba-84e9-4e1c-9e42-72277667b0a2" containerName="ovn-config" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 
10:32:13.036840 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d533f566-eded-44ca-b276-7e3d437f9fce" containerName="mariadb-account-create-update" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.037594 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.039248 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.075017 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-5vkkr"] Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.078379 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-njktw" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.160591 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.160670 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.160692 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.160738 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbm6m\" (UniqueName: \"kubernetes.io/projected/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-kube-api-access-zbm6m\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.160755 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.160785 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-config\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.261950 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n47d2\" (UniqueName: \"kubernetes.io/projected/ac26b4cb-ac0b-4b78-9c5e-60c6563b478e-kube-api-access-n47d2\") pod \"ac26b4cb-ac0b-4b78-9c5e-60c6563b478e\" (UID: \"ac26b4cb-ac0b-4b78-9c5e-60c6563b478e\") " Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.262254 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" 
(UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.262285 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.262381 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbm6m\" (UniqueName: \"kubernetes.io/projected/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-kube-api-access-zbm6m\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.262416 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.262469 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-config\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.262534 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: 
\"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.264552 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.264562 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.264898 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-config\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.264900 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.267535 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac26b4cb-ac0b-4b78-9c5e-60c6563b478e-kube-api-access-n47d2" (OuterVolumeSpecName: "kube-api-access-n47d2") pod "ac26b4cb-ac0b-4b78-9c5e-60c6563b478e" (UID: "ac26b4cb-ac0b-4b78-9c5e-60c6563b478e"). InnerVolumeSpecName "kube-api-access-n47d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.268116 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.279882 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbm6m\" (UniqueName: \"kubernetes.io/projected/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-kube-api-access-zbm6m\") pod \"dnsmasq-dns-5c79d794d7-5vkkr\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") " pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.364524 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n47d2\" (UniqueName: \"kubernetes.io/projected/ac26b4cb-ac0b-4b78-9c5e-60c6563b478e-kube-api-access-n47d2\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.387808 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.737088 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563832-njktw" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.739964 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563832-njktw" event={"ID":"ac26b4cb-ac0b-4b78-9c5e-60c6563b478e","Type":"ContainerDied","Data":"8ac1b0f08ec5d849a8bed24cb2098e48f5ac74bb6bd23eef8fa08f76cffaa710"} Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.740006 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ac1b0f08ec5d849a8bed24cb2098e48f5ac74bb6bd23eef8fa08f76cffaa710" Mar 18 10:32:13 crc kubenswrapper[4733]: I0318 10:32:13.963050 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-5vkkr"] Mar 18 10:32:13 crc kubenswrapper[4733]: W0318 10:32:13.974248 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0c771ba_dbb2_470b_b19c_8c8fdefbdd6d.slice/crio-e1e841a260dd28bf7f12085c583e93ce7e0418239f8aa3a3290f316b2610e14f WatchSource:0}: Error finding container e1e841a260dd28bf7f12085c583e93ce7e0418239f8aa3a3290f316b2610e14f: Status 404 returned error can't find the container with id e1e841a260dd28bf7f12085c583e93ce7e0418239f8aa3a3290f316b2610e14f Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.142298 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-tfzqx"] Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.150394 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563826-tfzqx"] Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.746079 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="9ba4505789b02a7aa27e40622e705a2188f60bebd64231f907e57dbee799f683" exitCode=0 Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.746140 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"9ba4505789b02a7aa27e40622e705a2188f60bebd64231f907e57dbee799f683"} Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.746202 4733 scope.go:117] "RemoveContainer" containerID="001b9e894ecd9bef6f800f761255da30231b8fbfeb22f304bc145622a4998afa" Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.746806 4733 scope.go:117] "RemoveContainer" containerID="9ba4505789b02a7aa27e40622e705a2188f60bebd64231f907e57dbee799f683" Mar 18 10:32:14 crc kubenswrapper[4733]: E0318 10:32:14.747044 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 10s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.749596 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="f53733af192d507492916cb8fdfb9e36a34a2d5b06777b3df798d1db42baebb1" exitCode=0 Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.749668 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"f53733af192d507492916cb8fdfb9e36a34a2d5b06777b3df798d1db42baebb1"} Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.750023 4733 scope.go:117] "RemoveContainer" containerID="f53733af192d507492916cb8fdfb9e36a34a2d5b06777b3df798d1db42baebb1" Mar 18 10:32:14 crc kubenswrapper[4733]: E0318 10:32:14.750185 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 10s restarting failed container=rabbitmq 
pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.752884 4733 generic.go:334] "Generic (PLEG): container finished" podID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" containerID="d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535" exitCode=0 Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.752914 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" event={"ID":"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d","Type":"ContainerDied","Data":"d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535"} Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.752941 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" event={"ID":"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d","Type":"ContainerStarted","Data":"e1e841a260dd28bf7f12085c583e93ce7e0418239f8aa3a3290f316b2610e14f"} Mar 18 10:32:14 crc kubenswrapper[4733]: I0318 10:32:14.906511 4733 scope.go:117] "RemoveContainer" containerID="1082901e937aee3ff012135f915e36be2d201e2adb97cf8ab3cb6b5dbb1e9f6f" Mar 18 10:32:15 crc kubenswrapper[4733]: I0318 10:32:15.191682 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9f28a6-6f4c-440b-abfc-cca26041cbef" path="/var/lib/kubelet/pods/eb9f28a6-6f4c-440b-abfc-cca26041cbef/volumes" Mar 18 10:32:15 crc kubenswrapper[4733]: I0318 10:32:15.782350 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" event={"ID":"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d","Type":"ContainerStarted","Data":"42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db"} Mar 18 10:32:15 crc kubenswrapper[4733]: I0318 10:32:15.782514 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:15 crc kubenswrapper[4733]: 
I0318 10:32:15.803828 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" podStartSLOduration=2.803803945 podStartE2EDuration="2.803803945s" podCreationTimestamp="2026-03-18 10:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:32:15.79761485 +0000 UTC m=+1175.289349215" watchObservedRunningTime="2026-03-18 10:32:15.803803945 +0000 UTC m=+1175.295538290" Mar 18 10:32:20 crc kubenswrapper[4733]: I0318 10:32:20.179293 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:32:21 crc kubenswrapper[4733]: I0318 10:32:21.858885 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ptbmt" event={"ID":"63c8f7bc-4162-4903-b3f9-96c8736a27b8","Type":"ContainerStarted","Data":"5eade19f29d1bfd378a24e80adf648a30d27b707a4c07763d5b7990ffd79ce55"} Mar 18 10:32:21 crc kubenswrapper[4733]: I0318 10:32:21.890769 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ptbmt" podStartSLOduration=3.034984458 podStartE2EDuration="31.890718105s" podCreationTimestamp="2026-03-18 10:31:50 +0000 UTC" firstStartedPulling="2026-03-18 10:31:51.77937263 +0000 UTC m=+1151.271106955" lastFinishedPulling="2026-03-18 10:32:20.635106257 +0000 UTC m=+1180.126840602" observedRunningTime="2026-03-18 10:32:21.88912331 +0000 UTC m=+1181.380857645" watchObservedRunningTime="2026-03-18 10:32:21.890718105 +0000 UTC m=+1181.382452420" Mar 18 10:32:23 crc kubenswrapper[4733]: I0318 10:32:23.389422 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" Mar 18 10:32:23 crc kubenswrapper[4733]: I0318 10:32:23.461086 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-h4pnt"] Mar 18 10:32:23 crc 
kubenswrapper[4733]: I0318 10:32:23.461554 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" podUID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" containerName="dnsmasq-dns" containerID="cri-o://76c24ca485c5cd0e612df85bd0c8ef951256abf933fe1359b894cd82b8ea15fb" gracePeriod=10 Mar 18 10:32:23 crc kubenswrapper[4733]: I0318 10:32:23.890815 4733 generic.go:334] "Generic (PLEG): container finished" podID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" containerID="76c24ca485c5cd0e612df85bd0c8ef951256abf933fe1359b894cd82b8ea15fb" exitCode=0 Mar 18 10:32:23 crc kubenswrapper[4733]: I0318 10:32:23.890854 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" event={"ID":"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be","Type":"ContainerDied","Data":"76c24ca485c5cd0e612df85bd0c8ef951256abf933fe1359b894cd82b8ea15fb"} Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.046082 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.143803 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-dns-svc\") pod \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.143918 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-nb\") pod \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.143962 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-config\") pod \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.144005 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-sb\") pod \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.144154 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clgwm\" (UniqueName: \"kubernetes.io/projected/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-kube-api-access-clgwm\") pod \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\" (UID: \"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be\") " Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.157454 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-kube-api-access-clgwm" (OuterVolumeSpecName: "kube-api-access-clgwm") pod "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" (UID: "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be"). InnerVolumeSpecName "kube-api-access-clgwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.208753 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" (UID: "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.215179 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" (UID: "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.233383 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" (UID: "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.249663 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-config" (OuterVolumeSpecName: "config") pod "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" (UID: "7bdf8dbb-ffe1-48d1-9c79-22e37dd882be"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.251327 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.251353 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-config\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.251366 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.251379 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clgwm\" (UniqueName: \"kubernetes.io/projected/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-kube-api-access-clgwm\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.251401 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.902723 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" event={"ID":"7bdf8dbb-ffe1-48d1-9c79-22e37dd882be","Type":"ContainerDied","Data":"5dd7dc77696d4097c1648883d1fba422fc00eb1a9ede4031a68c1b0d6e1e9d1c"} Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.902812 4733 scope.go:117] "RemoveContainer" containerID="76c24ca485c5cd0e612df85bd0c8ef951256abf933fe1359b894cd82b8ea15fb" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.902867 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-h4pnt" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.930105 4733 scope.go:117] "RemoveContainer" containerID="be1323a707d76c996153e9edb3286a8842293d7d0852b41ecba2e5d11f48e074" Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.966947 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-h4pnt"] Mar 18 10:32:24 crc kubenswrapper[4733]: I0318 10:32:24.976099 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-h4pnt"] Mar 18 10:32:25 crc kubenswrapper[4733]: I0318 10:32:25.185861 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" path="/var/lib/kubelet/pods/7bdf8dbb-ffe1-48d1-9c79-22e37dd882be/volumes" Mar 18 10:32:26 crc kubenswrapper[4733]: I0318 10:32:26.176558 4733 scope.go:117] "RemoveContainer" containerID="f53733af192d507492916cb8fdfb9e36a34a2d5b06777b3df798d1db42baebb1" Mar 18 10:32:26 crc kubenswrapper[4733]: I0318 10:32:26.920627 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"c403cc91104f4f18606a75c2c7c0e5519b21ff1fde3dacb452abcf30617940a4"} Mar 18 10:32:26 crc kubenswrapper[4733]: I0318 10:32:26.921155 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 10:32:27 crc kubenswrapper[4733]: I0318 10:32:27.931954 4733 generic.go:334] "Generic (PLEG): container finished" podID="63c8f7bc-4162-4903-b3f9-96c8736a27b8" containerID="5eade19f29d1bfd378a24e80adf648a30d27b707a4c07763d5b7990ffd79ce55" exitCode=0 Mar 18 10:32:27 crc kubenswrapper[4733]: I0318 10:32:27.932292 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ptbmt" 
event={"ID":"63c8f7bc-4162-4903-b3f9-96c8736a27b8","Type":"ContainerDied","Data":"5eade19f29d1bfd378a24e80adf648a30d27b707a4c07763d5b7990ffd79ce55"} Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.176317 4733 scope.go:117] "RemoveContainer" containerID="9ba4505789b02a7aa27e40622e705a2188f60bebd64231f907e57dbee799f683" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.341106 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ptbmt" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.433065 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmmrf\" (UniqueName: \"kubernetes.io/projected/63c8f7bc-4162-4903-b3f9-96c8736a27b8-kube-api-access-vmmrf\") pod \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.433160 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-db-sync-config-data\") pod \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.433266 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-combined-ca-bundle\") pod \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.433366 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-config-data\") pod \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\" (UID: \"63c8f7bc-4162-4903-b3f9-96c8736a27b8\") " Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 
10:32:29.444879 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "63c8f7bc-4162-4903-b3f9-96c8736a27b8" (UID: "63c8f7bc-4162-4903-b3f9-96c8736a27b8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.444943 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c8f7bc-4162-4903-b3f9-96c8736a27b8-kube-api-access-vmmrf" (OuterVolumeSpecName: "kube-api-access-vmmrf") pod "63c8f7bc-4162-4903-b3f9-96c8736a27b8" (UID: "63c8f7bc-4162-4903-b3f9-96c8736a27b8"). InnerVolumeSpecName "kube-api-access-vmmrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.460778 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63c8f7bc-4162-4903-b3f9-96c8736a27b8" (UID: "63c8f7bc-4162-4903-b3f9-96c8736a27b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.496454 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-config-data" (OuterVolumeSpecName: "config-data") pod "63c8f7bc-4162-4903-b3f9-96c8736a27b8" (UID: "63c8f7bc-4162-4903-b3f9-96c8736a27b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.535516 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmmrf\" (UniqueName: \"kubernetes.io/projected/63c8f7bc-4162-4903-b3f9-96c8736a27b8-kube-api-access-vmmrf\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.535552 4733 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.535564 4733 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.535577 4733 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c8f7bc-4162-4903-b3f9-96c8736a27b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.954844 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"e897d31f6a9846ce7ce6f729eb4ad9ad29fd1f9d58f4b6a76aae048e641e8982"} Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.955966 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.957182 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ptbmt" event={"ID":"63c8f7bc-4162-4903-b3f9-96c8736a27b8","Type":"ContainerDied","Data":"0e53d30c9ba54d0bab85e4e6730952c64da04afa3c77dd75422e1f34b5188d78"} Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.957225 4733 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e53d30c9ba54d0bab85e4e6730952c64da04afa3c77dd75422e1f34b5188d78" Mar 18 10:32:29 crc kubenswrapper[4733]: I0318 10:32:29.957272 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ptbmt" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.450717 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-z8m4g"] Mar 18 10:32:30 crc kubenswrapper[4733]: E0318 10:32:30.451011 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" containerName="dnsmasq-dns" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.451022 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" containerName="dnsmasq-dns" Mar 18 10:32:30 crc kubenswrapper[4733]: E0318 10:32:30.451038 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" containerName="init" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.451044 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" containerName="init" Mar 18 10:32:30 crc kubenswrapper[4733]: E0318 10:32:30.451057 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c8f7bc-4162-4903-b3f9-96c8736a27b8" containerName="glance-db-sync" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.451063 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c8f7bc-4162-4903-b3f9-96c8736a27b8" containerName="glance-db-sync" Mar 18 10:32:30 crc kubenswrapper[4733]: E0318 10:32:30.451080 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac26b4cb-ac0b-4b78-9c5e-60c6563b478e" containerName="oc" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.451085 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac26b4cb-ac0b-4b78-9c5e-60c6563b478e" containerName="oc" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.451235 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bdf8dbb-ffe1-48d1-9c79-22e37dd882be" containerName="dnsmasq-dns" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.451250 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac26b4cb-ac0b-4b78-9c5e-60c6563b478e" containerName="oc" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.451260 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c8f7bc-4162-4903-b3f9-96c8736a27b8" containerName="glance-db-sync" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.451985 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.474700 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-z8m4g"] Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.650909 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.651215 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.651244 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spshf\" 
(UniqueName: \"kubernetes.io/projected/5fcd9264-61af-4872-82e6-8b0e1667ac70-kube-api-access-spshf\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.651371 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-config\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.651439 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.651475 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.752640 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.752690 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.752734 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.752760 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.752777 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spshf\" (UniqueName: \"kubernetes.io/projected/5fcd9264-61af-4872-82e6-8b0e1667ac70-kube-api-access-spshf\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.753120 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-config\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.753622 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.753845 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.753867 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.754786 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-config\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.755033 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5fcd9264-61af-4872-82e6-8b0e1667ac70-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.781489 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spshf\" (UniqueName: \"kubernetes.io/projected/5fcd9264-61af-4872-82e6-8b0e1667ac70-kube-api-access-spshf\") pod 
\"dnsmasq-dns-5f59b8f679-z8m4g\" (UID: \"5fcd9264-61af-4872-82e6-8b0e1667ac70\") " pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.966418 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="c403cc91104f4f18606a75c2c7c0e5519b21ff1fde3dacb452abcf30617940a4" exitCode=0 Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.967304 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"c403cc91104f4f18606a75c2c7c0e5519b21ff1fde3dacb452abcf30617940a4"} Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.967348 4733 scope.go:117] "RemoveContainer" containerID="f53733af192d507492916cb8fdfb9e36a34a2d5b06777b3df798d1db42baebb1" Mar 18 10:32:30 crc kubenswrapper[4733]: I0318 10:32:30.967752 4733 scope.go:117] "RemoveContainer" containerID="c403cc91104f4f18606a75c2c7c0e5519b21ff1fde3dacb452abcf30617940a4" Mar 18 10:32:30 crc kubenswrapper[4733]: E0318 10:32:30.967953 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:32:31 crc kubenswrapper[4733]: I0318 10:32:31.068330 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" Mar 18 10:32:31 crc kubenswrapper[4733]: I0318 10:32:31.512287 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-z8m4g"] Mar 18 10:32:31 crc kubenswrapper[4733]: W0318 10:32:31.514846 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fcd9264_61af_4872_82e6_8b0e1667ac70.slice/crio-24b45d20a646f19c4ac7c53f08aa1252d7641f341ab6088ea4d0e7ab19ced41f WatchSource:0}: Error finding container 24b45d20a646f19c4ac7c53f08aa1252d7641f341ab6088ea4d0e7ab19ced41f: Status 404 returned error can't find the container with id 24b45d20a646f19c4ac7c53f08aa1252d7641f341ab6088ea4d0e7ab19ced41f Mar 18 10:32:31 crc kubenswrapper[4733]: I0318 10:32:31.977484 4733 generic.go:334] "Generic (PLEG): container finished" podID="5fcd9264-61af-4872-82e6-8b0e1667ac70" containerID="683bc45d8d919be15f5e003415fd39aa3c57a2d498bf02cfe2428c0d955eea7e" exitCode=0 Mar 18 10:32:31 crc kubenswrapper[4733]: I0318 10:32:31.977548 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" event={"ID":"5fcd9264-61af-4872-82e6-8b0e1667ac70","Type":"ContainerDied","Data":"683bc45d8d919be15f5e003415fd39aa3c57a2d498bf02cfe2428c0d955eea7e"} Mar 18 10:32:31 crc kubenswrapper[4733]: I0318 10:32:31.977907 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" event={"ID":"5fcd9264-61af-4872-82e6-8b0e1667ac70","Type":"ContainerStarted","Data":"24b45d20a646f19c4ac7c53f08aa1252d7641f341ab6088ea4d0e7ab19ced41f"} Mar 18 10:32:33 crc kubenswrapper[4733]: I0318 10:32:32.999716 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" event={"ID":"5fcd9264-61af-4872-82e6-8b0e1667ac70","Type":"ContainerStarted","Data":"4875c8e8342cbcaa2e5b0a8a9970108486fdbbadf86eefa8ef87d6df3340240a"} Mar 18 10:32:33 crc 
kubenswrapper[4733]: I0318 10:32:33.000376 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g"
Mar 18 10:32:33 crc kubenswrapper[4733]: I0318 10:32:33.029885 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g" podStartSLOduration=3.02986627 podStartE2EDuration="3.02986627s" podCreationTimestamp="2026-03-18 10:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 10:32:33.021547405 +0000 UTC m=+1192.513281730" watchObservedRunningTime="2026-03-18 10:32:33.02986627 +0000 UTC m=+1192.521600585"
Mar 18 10:32:34 crc kubenswrapper[4733]: I0318 10:32:34.009055 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="e897d31f6a9846ce7ce6f729eb4ad9ad29fd1f9d58f4b6a76aae048e641e8982" exitCode=0
Mar 18 10:32:34 crc kubenswrapper[4733]: I0318 10:32:34.009138 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"e897d31f6a9846ce7ce6f729eb4ad9ad29fd1f9d58f4b6a76aae048e641e8982"}
Mar 18 10:32:34 crc kubenswrapper[4733]: I0318 10:32:34.009631 4733 scope.go:117] "RemoveContainer" containerID="9ba4505789b02a7aa27e40622e705a2188f60bebd64231f907e57dbee799f683"
Mar 18 10:32:34 crc kubenswrapper[4733]: I0318 10:32:34.010403 4733 scope.go:117] "RemoveContainer" containerID="e897d31f6a9846ce7ce6f729eb4ad9ad29fd1f9d58f4b6a76aae048e641e8982"
Mar 18 10:32:34 crc kubenswrapper[4733]: E0318 10:32:34.010666 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.071493 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-z8m4g"
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.159475 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-5vkkr"]
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.159754 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" podUID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" containerName="dnsmasq-dns" containerID="cri-o://42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db" gracePeriod=10
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.685315 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr"
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.772739 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbm6m\" (UniqueName: \"kubernetes.io/projected/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-kube-api-access-zbm6m\") pod \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") "
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.772774 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-swift-storage-0\") pod \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") "
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.772824 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-svc\") pod \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") "
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.772857 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-config\") pod \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") "
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.772879 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-sb\") pod \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") "
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.772911 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-nb\") pod \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\" (UID: \"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d\") "
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.785334 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-kube-api-access-zbm6m" (OuterVolumeSpecName: "kube-api-access-zbm6m") pod "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" (UID: "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d"). InnerVolumeSpecName "kube-api-access-zbm6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.843014 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" (UID: "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.843953 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" (UID: "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.852970 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" (UID: "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.857111 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" (UID: "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.876394 4733 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.876459 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.876475 4733 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.876488 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbm6m\" (UniqueName: \"kubernetes.io/projected/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-kube-api-access-zbm6m\") on node \"crc\" DevicePath \"\""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.876502 4733 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.877571 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-config" (OuterVolumeSpecName: "config") pod "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" (UID: "e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 10:32:41 crc kubenswrapper[4733]: I0318 10:32:41.977725 4733 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d-config\") on node \"crc\" DevicePath \"\""
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.080984 4733 generic.go:334] "Generic (PLEG): container finished" podID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" containerID="42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db" exitCode=0
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.081030 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr"
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.081031 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" event={"ID":"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d","Type":"ContainerDied","Data":"42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db"}
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.081136 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-5vkkr" event={"ID":"e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d","Type":"ContainerDied","Data":"e1e841a260dd28bf7f12085c583e93ce7e0418239f8aa3a3290f316b2610e14f"}
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.081155 4733 scope.go:117] "RemoveContainer" containerID="42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db"
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.100159 4733 scope.go:117] "RemoveContainer" containerID="d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535"
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.114039 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-5vkkr"]
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.120813 4733 scope.go:117] "RemoveContainer" containerID="42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db"
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.121108 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-5vkkr"]
Mar 18 10:32:42 crc kubenswrapper[4733]: E0318 10:32:42.121298 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db\": container with ID starting with 42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db not found: ID does not exist" containerID="42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db"
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.121337 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db"} err="failed to get container status \"42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db\": rpc error: code = NotFound desc = could not find container \"42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db\": container with ID starting with 42e7ffa5d83fe25c846d3177b274430b818fc1f5e5b4e5a9ae3ffea34dec97db not found: ID does not exist"
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.121368 4733 scope.go:117] "RemoveContainer" containerID="d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535"
Mar 18 10:32:42 crc kubenswrapper[4733]: E0318 10:32:42.121664 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535\": container with ID starting with d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535 not found: ID does not exist" containerID="d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535"
Mar 18 10:32:42 crc kubenswrapper[4733]: I0318 10:32:42.121693 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535"} err="failed to get container status \"d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535\": rpc error: code = NotFound desc = could not find container \"d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535\": container with ID starting with d08ced42dfa9a4c0f8b5c1fc1217494bf5cf7c8b883d4f35abfd833bac185535 not found: ID does not exist"
Mar 18 10:32:43 crc kubenswrapper[4733]: I0318 10:32:43.176723 4733 scope.go:117] "RemoveContainer" containerID="c403cc91104f4f18606a75c2c7c0e5519b21ff1fde3dacb452abcf30617940a4"
Mar 18 10:32:43 crc kubenswrapper[4733]: E0318 10:32:43.177457 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:32:43 crc kubenswrapper[4733]: I0318 10:32:43.192980 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" path="/var/lib/kubelet/pods/e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d/volumes"
Mar 18 10:32:43 crc kubenswrapper[4733]: I0318 10:32:43.571021 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:32:43 crc kubenswrapper[4733]: I0318 10:32:43.571119 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:32:48 crc kubenswrapper[4733]: I0318 10:32:48.175851 4733 scope.go:117] "RemoveContainer" containerID="e897d31f6a9846ce7ce6f729eb4ad9ad29fd1f9d58f4b6a76aae048e641e8982"
Mar 18 10:32:48 crc kubenswrapper[4733]: E0318 10:32:48.176582 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:32:54 crc kubenswrapper[4733]: I0318 10:32:54.175669 4733 scope.go:117] "RemoveContainer" containerID="c403cc91104f4f18606a75c2c7c0e5519b21ff1fde3dacb452abcf30617940a4"
Mar 18 10:32:55 crc kubenswrapper[4733]: I0318 10:32:55.202537 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"bdf571c67b493e3fd2b9642e6dd66519b598c153da1f077075fd25aebffa1e9c"}
Mar 18 10:32:55 crc kubenswrapper[4733]: I0318 10:32:55.203580 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 18 10:32:59 crc kubenswrapper[4733]: I0318 10:32:59.239632 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="bdf571c67b493e3fd2b9642e6dd66519b598c153da1f077075fd25aebffa1e9c" exitCode=0
Mar 18 10:32:59 crc kubenswrapper[4733]: I0318 10:32:59.239764 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"bdf571c67b493e3fd2b9642e6dd66519b598c153da1f077075fd25aebffa1e9c"}
Mar 18 10:32:59 crc kubenswrapper[4733]: I0318 10:32:59.240041 4733 scope.go:117] "RemoveContainer" containerID="c403cc91104f4f18606a75c2c7c0e5519b21ff1fde3dacb452abcf30617940a4"
Mar 18 10:32:59 crc kubenswrapper[4733]: I0318 10:32:59.240765 4733 scope.go:117] "RemoveContainer" containerID="bdf571c67b493e3fd2b9642e6dd66519b598c153da1f077075fd25aebffa1e9c"
Mar 18 10:32:59 crc kubenswrapper[4733]: E0318 10:32:59.241111 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:33:02 crc kubenswrapper[4733]: I0318 10:33:02.175984 4733 scope.go:117] "RemoveContainer" containerID="e897d31f6a9846ce7ce6f729eb4ad9ad29fd1f9d58f4b6a76aae048e641e8982"
Mar 18 10:33:03 crc kubenswrapper[4733]: I0318 10:33:03.281881 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"404382805ba91938d5973ffc7857ed67b92775fae6cf128d8db979d4adcb6eaa"}
Mar 18 10:33:03 crc kubenswrapper[4733]: I0318 10:33:03.282554 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:33:07 crc kubenswrapper[4733]: I0318 10:33:07.324852 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="404382805ba91938d5973ffc7857ed67b92775fae6cf128d8db979d4adcb6eaa" exitCode=0
Mar 18 10:33:07 crc kubenswrapper[4733]: I0318 10:33:07.324911 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"404382805ba91938d5973ffc7857ed67b92775fae6cf128d8db979d4adcb6eaa"}
Mar 18 10:33:07 crc kubenswrapper[4733]: I0318 10:33:07.325331 4733 scope.go:117] "RemoveContainer" containerID="e897d31f6a9846ce7ce6f729eb4ad9ad29fd1f9d58f4b6a76aae048e641e8982"
Mar 18 10:33:07 crc kubenswrapper[4733]: I0318 10:33:07.326727 4733 scope.go:117] "RemoveContainer" containerID="404382805ba91938d5973ffc7857ed67b92775fae6cf128d8db979d4adcb6eaa"
Mar 18 10:33:07 crc kubenswrapper[4733]: E0318 10:33:07.327166 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:33:08 crc kubenswrapper[4733]: I0318 10:33:08.570637 4733 scope.go:117] "RemoveContainer" containerID="73a17ce4bce512adc8ff4282e561fca0880aa24a1a28aaa52332d077a8673f8c"
Mar 18 10:33:13 crc kubenswrapper[4733]: I0318 10:33:13.571884 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:33:13 crc kubenswrapper[4733]: I0318 10:33:13.572530 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:33:14 crc kubenswrapper[4733]: I0318 10:33:14.176445 4733 scope.go:117] "RemoveContainer" containerID="bdf571c67b493e3fd2b9642e6dd66519b598c153da1f077075fd25aebffa1e9c"
Mar 18 10:33:14 crc kubenswrapper[4733]: E0318 10:33:14.177107 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:33:22 crc kubenswrapper[4733]: I0318 10:33:22.175749 4733 scope.go:117] "RemoveContainer" containerID="404382805ba91938d5973ffc7857ed67b92775fae6cf128d8db979d4adcb6eaa"
Mar 18 10:33:22 crc kubenswrapper[4733]: E0318 10:33:22.176507 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:33:29 crc kubenswrapper[4733]: I0318 10:33:29.176618 4733 scope.go:117] "RemoveContainer" containerID="bdf571c67b493e3fd2b9642e6dd66519b598c153da1f077075fd25aebffa1e9c"
Mar 18 10:33:29 crc kubenswrapper[4733]: E0318 10:33:29.177531 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:33:36 crc kubenswrapper[4733]: I0318 10:33:36.175418 4733 scope.go:117] "RemoveContainer" containerID="404382805ba91938d5973ffc7857ed67b92775fae6cf128d8db979d4adcb6eaa"
Mar 18 10:33:36 crc kubenswrapper[4733]: E0318 10:33:36.176269 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:33:41 crc kubenswrapper[4733]: I0318 10:33:41.187832 4733 scope.go:117] "RemoveContainer" containerID="bdf571c67b493e3fd2b9642e6dd66519b598c153da1f077075fd25aebffa1e9c"
Mar 18 10:33:41 crc kubenswrapper[4733]: I0318 10:33:41.672691 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686"}
Mar 18 10:33:41 crc kubenswrapper[4733]: I0318 10:33:41.674072 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 18 10:33:43 crc kubenswrapper[4733]: I0318 10:33:43.571355 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:33:43 crc kubenswrapper[4733]: I0318 10:33:43.571821 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:33:43 crc kubenswrapper[4733]: I0318 10:33:43.571906 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp"
Mar 18 10:33:43 crc kubenswrapper[4733]: I0318 10:33:43.573792 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"345f1c51e0b2f38e27fd31ce4a7323d51ffa4b8419f456177dd8653558afb625"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 10:33:43 crc kubenswrapper[4733]: I0318 10:33:43.573916 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://345f1c51e0b2f38e27fd31ce4a7323d51ffa4b8419f456177dd8653558afb625" gracePeriod=600
Mar 18 10:33:44 crc kubenswrapper[4733]: I0318 10:33:44.710782 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="345f1c51e0b2f38e27fd31ce4a7323d51ffa4b8419f456177dd8653558afb625" exitCode=0
Mar 18 10:33:44 crc kubenswrapper[4733]: I0318 10:33:44.710901 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"345f1c51e0b2f38e27fd31ce4a7323d51ffa4b8419f456177dd8653558afb625"}
Mar 18 10:33:44 crc kubenswrapper[4733]: I0318 10:33:44.711176 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"18491327409d036c07217a5bf65332367e43c6f94559e59f3995caefe0f899d9"}
Mar 18 10:33:44 crc kubenswrapper[4733]: I0318 10:33:44.711216 4733 scope.go:117] "RemoveContainer" containerID="2a78644e078fbb319d0fc66d47cfb2501076e4fd678ad793e791ddb4f3d3ee96"
Mar 18 10:33:45 crc kubenswrapper[4733]: I0318 10:33:45.733456 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686" exitCode=0
Mar 18 10:33:45 crc kubenswrapper[4733]: I0318 10:33:45.733539 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686"}
Mar 18 10:33:45 crc kubenswrapper[4733]: I0318 10:33:45.734029 4733 scope.go:117] "RemoveContainer" containerID="bdf571c67b493e3fd2b9642e6dd66519b598c153da1f077075fd25aebffa1e9c"
Mar 18 10:33:45 crc kubenswrapper[4733]: I0318 10:33:45.735481 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686"
Mar 18 10:33:45 crc kubenswrapper[4733]: E0318 10:33:45.736178 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:33:49 crc kubenswrapper[4733]: I0318 10:33:49.175960 4733 scope.go:117] "RemoveContainer" containerID="404382805ba91938d5973ffc7857ed67b92775fae6cf128d8db979d4adcb6eaa"
Mar 18 10:33:49 crc kubenswrapper[4733]: I0318 10:33:49.784683 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00"}
Mar 18 10:33:49 crc kubenswrapper[4733]: I0318 10:33:49.785578 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:33:53 crc kubenswrapper[4733]: I0318 10:33:53.822208 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00" exitCode=0
Mar 18 10:33:53 crc kubenswrapper[4733]: I0318 10:33:53.822322 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00"}
Mar 18 10:33:53 crc kubenswrapper[4733]: I0318 10:33:53.822834 4733 scope.go:117] "RemoveContainer" containerID="404382805ba91938d5973ffc7857ed67b92775fae6cf128d8db979d4adcb6eaa"
Mar 18 10:33:53 crc kubenswrapper[4733]: I0318 10:33:53.823680 4733 scope.go:117] "RemoveContainer" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00"
Mar 18 10:33:53 crc kubenswrapper[4733]: E0318 10:33:53.824007 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:33:58 crc kubenswrapper[4733]: I0318 10:33:58.176402 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686"
Mar 18 10:33:58 crc kubenswrapper[4733]: E0318 10:33:58.177111 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.157638 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563834-7bqxq"]
Mar 18 10:34:00 crc kubenswrapper[4733]: E0318 10:34:00.158538 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" containerName="dnsmasq-dns"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.158559 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" containerName="dnsmasq-dns"
Mar 18 10:34:00 crc kubenswrapper[4733]: E0318 10:34:00.158580 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" containerName="init"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.158592 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" containerName="init"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.158898 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c771ba-dbb2-470b-b19c-8c8fdefbdd6d" containerName="dnsmasq-dns"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.159807 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-7bqxq"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.165036 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.167710 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.168829 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.170057 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-7bqxq"]
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.201707 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crlqs\" (UniqueName: \"kubernetes.io/projected/dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316-kube-api-access-crlqs\") pod \"auto-csr-approver-29563834-7bqxq\" (UID: \"dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316\") " pod="openshift-infra/auto-csr-approver-29563834-7bqxq"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.304498 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crlqs\" (UniqueName: \"kubernetes.io/projected/dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316-kube-api-access-crlqs\") pod \"auto-csr-approver-29563834-7bqxq\" (UID: \"dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316\") " pod="openshift-infra/auto-csr-approver-29563834-7bqxq"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.344269 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crlqs\" (UniqueName: \"kubernetes.io/projected/dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316-kube-api-access-crlqs\") pod \"auto-csr-approver-29563834-7bqxq\" (UID: \"dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316\") " pod="openshift-infra/auto-csr-approver-29563834-7bqxq"
Mar 18 10:34:00 crc kubenswrapper[4733]: I0318 10:34:00.513896 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-7bqxq"
Mar 18 10:34:01 crc kubenswrapper[4733]: I0318 10:34:01.632276 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-7bqxq"]
Mar 18 10:34:01 crc kubenswrapper[4733]: W0318 10:34:01.635411 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc0f56eb_6c6a_49bf_9a12_ef5f2dd95316.slice/crio-29b17b8ff0f2a892cfd860135ff82140434dd4ba914f10aa584db885c8ec2e17 WatchSource:0}: Error finding container 29b17b8ff0f2a892cfd860135ff82140434dd4ba914f10aa584db885c8ec2e17: Status 404 returned error can't find the container with id 29b17b8ff0f2a892cfd860135ff82140434dd4ba914f10aa584db885c8ec2e17
Mar 18 10:34:01 crc kubenswrapper[4733]: I0318 10:34:01.905231 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-7bqxq" event={"ID":"dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316","Type":"ContainerStarted","Data":"29b17b8ff0f2a892cfd860135ff82140434dd4ba914f10aa584db885c8ec2e17"}
Mar 18 10:34:03 crc kubenswrapper[4733]: I0318 10:34:03.930377 4733 generic.go:334] "Generic (PLEG): container finished" podID="dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316" containerID="2746f736c334d9ac3079e5dc9b5db5929c610a6933e47547e58536ec78e443c9" exitCode=0
Mar 18 10:34:03 crc kubenswrapper[4733]: I0318 10:34:03.930472 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-7bqxq" event={"ID":"dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316","Type":"ContainerDied","Data":"2746f736c334d9ac3079e5dc9b5db5929c610a6933e47547e58536ec78e443c9"}
Mar 18 10:34:05 crc kubenswrapper[4733]: I0318 10:34:05.350340 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-7bqxq"
Mar 18 10:34:05 crc kubenswrapper[4733]: I0318 10:34:05.389032 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crlqs\" (UniqueName: \"kubernetes.io/projected/dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316-kube-api-access-crlqs\") pod \"dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316\" (UID: \"dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316\") "
Mar 18 10:34:05 crc kubenswrapper[4733]: I0318 10:34:05.399471 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316-kube-api-access-crlqs" (OuterVolumeSpecName: "kube-api-access-crlqs") pod "dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316" (UID: "dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316"). InnerVolumeSpecName "kube-api-access-crlqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:34:05 crc kubenswrapper[4733]: I0318 10:34:05.490305 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crlqs\" (UniqueName: \"kubernetes.io/projected/dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316-kube-api-access-crlqs\") on node \"crc\" DevicePath \"\""
Mar 18 10:34:05 crc kubenswrapper[4733]: I0318 10:34:05.965677 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563834-7bqxq" event={"ID":"dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316","Type":"ContainerDied","Data":"29b17b8ff0f2a892cfd860135ff82140434dd4ba914f10aa584db885c8ec2e17"}
Mar 18 10:34:05 crc kubenswrapper[4733]: I0318 10:34:05.966165 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29b17b8ff0f2a892cfd860135ff82140434dd4ba914f10aa584db885c8ec2e17"
Mar 18 10:34:05 crc kubenswrapper[4733]: I0318 10:34:05.966041 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563834-7bqxq"
Mar 18 10:34:06 crc kubenswrapper[4733]: I0318 10:34:06.434054 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-zczv7"]
Mar 18 10:34:06 crc kubenswrapper[4733]: I0318 10:34:06.448233 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563828-zczv7"]
Mar 18 10:34:07 crc kubenswrapper[4733]: I0318 10:34:07.175138 4733 scope.go:117] "RemoveContainer" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00"
Mar 18 10:34:07 crc kubenswrapper[4733]: E0318 10:34:07.175394 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:34:07 crc kubenswrapper[4733]: I0318 10:34:07.188999 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68574d72-725d-48c2-b645-bd83dcccbf80" path="/var/lib/kubelet/pods/68574d72-725d-48c2-b645-bd83dcccbf80/volumes"
Mar 18 10:34:08 crc kubenswrapper[4733]: I0318 10:34:08.687280 4733 scope.go:117] "RemoveContainer" containerID="f4a3549ea82cce03bd994263d641938a407bdfdc2f86792bccee0b653493614d"
Mar 18 10:34:12 crc kubenswrapper[4733]: I0318 10:34:12.177602 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686"
Mar 18 10:34:12 crc kubenswrapper[4733]: E0318 10:34:12.178650 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:34:20 crc kubenswrapper[4733]: I0318 10:34:20.176083 4733 scope.go:117] "RemoveContainer" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00"
Mar 18 10:34:20 crc kubenswrapper[4733]: E0318 10:34:20.176929 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:34:23 crc kubenswrapper[4733]: I0318 10:34:23.176370 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686"
Mar 18 10:34:23 crc kubenswrapper[4733]: E0318 10:34:23.177308 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with
CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:34:35 crc kubenswrapper[4733]: I0318 10:34:35.177413 4733 scope.go:117] "RemoveContainer" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00" Mar 18 10:34:35 crc kubenswrapper[4733]: E0318 10:34:35.179051 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:34:36 crc kubenswrapper[4733]: I0318 10:34:36.175560 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686" Mar 18 10:34:36 crc kubenswrapper[4733]: E0318 10:34:36.176063 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:34:47 crc kubenswrapper[4733]: I0318 10:34:47.175606 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686" Mar 18 10:34:47 crc kubenswrapper[4733]: E0318 10:34:47.176871 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:34:50 crc 
kubenswrapper[4733]: I0318 10:34:50.175584 4733 scope.go:117] "RemoveContainer" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00" Mar 18 10:34:50 crc kubenswrapper[4733]: E0318 10:34:50.176171 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:35:00 crc kubenswrapper[4733]: I0318 10:35:00.175976 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686" Mar 18 10:35:00 crc kubenswrapper[4733]: E0318 10:35:00.177096 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:35:04 crc kubenswrapper[4733]: I0318 10:35:04.194146 4733 scope.go:117] "RemoveContainer" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00" Mar 18 10:35:04 crc kubenswrapper[4733]: E0318 10:35:04.195258 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:35:11 crc kubenswrapper[4733]: I0318 10:35:11.184555 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686" Mar 18 10:35:11 crc kubenswrapper[4733]: I0318 10:35:11.655012 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a"} Mar 18 10:35:11 crc kubenswrapper[4733]: I0318 10:35:11.655681 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 10:35:15 crc kubenswrapper[4733]: I0318 10:35:15.176510 4733 scope.go:117] "RemoveContainer" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00" Mar 18 10:35:15 crc kubenswrapper[4733]: I0318 10:35:15.696863 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e"} Mar 18 10:35:15 crc kubenswrapper[4733]: I0318 10:35:15.697183 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:35:15 crc kubenswrapper[4733]: I0318 10:35:15.701180 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" exitCode=0 Mar 18 10:35:15 crc kubenswrapper[4733]: I0318 10:35:15.701323 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a"} Mar 18 10:35:15 crc kubenswrapper[4733]: I0318 10:35:15.701442 4733 scope.go:117] "RemoveContainer" containerID="693e6eba0eed87d5064eb695aef9f113ca2e6fa1b8fe4241bf5171215cf4e686" Mar 18 10:35:15 crc kubenswrapper[4733]: I0318 10:35:15.702137 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:35:15 crc 
kubenswrapper[4733]: E0318 10:35:15.702568 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:35:19 crc kubenswrapper[4733]: I0318 10:35:19.751477 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" exitCode=0 Mar 18 10:35:19 crc kubenswrapper[4733]: I0318 10:35:19.751631 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e"} Mar 18 10:35:19 crc kubenswrapper[4733]: I0318 10:35:19.752348 4733 scope.go:117] "RemoveContainer" containerID="f53819fc80d135628a27c9199c900935f63ae50e8bb03f43fb957f7ef27dfd00" Mar 18 10:35:19 crc kubenswrapper[4733]: I0318 10:35:19.753641 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:35:19 crc kubenswrapper[4733]: E0318 10:35:19.754023 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:35:30 crc kubenswrapper[4733]: I0318 10:35:30.175796 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:35:30 crc kubenswrapper[4733]: I0318 10:35:30.178339 4733 scope.go:117] "RemoveContainer" 
containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:35:30 crc kubenswrapper[4733]: E0318 10:35:30.178826 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:35:30 crc kubenswrapper[4733]: E0318 10:35:30.178845 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:35:41 crc kubenswrapper[4733]: I0318 10:35:41.182249 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:35:41 crc kubenswrapper[4733]: E0318 10:35:41.183093 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:35:43 crc kubenswrapper[4733]: I0318 10:35:43.571044 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:35:43 crc kubenswrapper[4733]: I0318 10:35:43.571527 4733 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:35:44 crc kubenswrapper[4733]: I0318 10:35:44.175605 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:35:44 crc kubenswrapper[4733]: E0318 10:35:44.176506 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:35:52 crc kubenswrapper[4733]: I0318 10:35:52.175597 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:35:52 crc kubenswrapper[4733]: E0318 10:35:52.176639 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:35:58 crc kubenswrapper[4733]: I0318 10:35:58.175554 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:35:58 crc kubenswrapper[4733]: E0318 10:35:58.176344 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" 
podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.149229 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563836-4x58h"] Mar 18 10:36:00 crc kubenswrapper[4733]: E0318 10:36:00.149954 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316" containerName="oc" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.149971 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316" containerName="oc" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.150218 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316" containerName="oc" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.150795 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-4x58h" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.153355 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.153407 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.154667 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.156978 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-4x58h"] Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.263636 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c46cx\" (UniqueName: \"kubernetes.io/projected/d31d006c-81a6-4bbb-a44a-fda966944372-kube-api-access-c46cx\") pod 
\"auto-csr-approver-29563836-4x58h\" (UID: \"d31d006c-81a6-4bbb-a44a-fda966944372\") " pod="openshift-infra/auto-csr-approver-29563836-4x58h" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.366031 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c46cx\" (UniqueName: \"kubernetes.io/projected/d31d006c-81a6-4bbb-a44a-fda966944372-kube-api-access-c46cx\") pod \"auto-csr-approver-29563836-4x58h\" (UID: \"d31d006c-81a6-4bbb-a44a-fda966944372\") " pod="openshift-infra/auto-csr-approver-29563836-4x58h" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.386619 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c46cx\" (UniqueName: \"kubernetes.io/projected/d31d006c-81a6-4bbb-a44a-fda966944372-kube-api-access-c46cx\") pod \"auto-csr-approver-29563836-4x58h\" (UID: \"d31d006c-81a6-4bbb-a44a-fda966944372\") " pod="openshift-infra/auto-csr-approver-29563836-4x58h" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.472967 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-4x58h" Mar 18 10:36:00 crc kubenswrapper[4733]: I0318 10:36:00.940454 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-4x58h"] Mar 18 10:36:01 crc kubenswrapper[4733]: I0318 10:36:01.554127 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-4x58h" event={"ID":"d31d006c-81a6-4bbb-a44a-fda966944372","Type":"ContainerStarted","Data":"450d93d363176e50bafab37a68c6e96830ec396aca9eb8497b1b034d1c6120d3"} Mar 18 10:36:02 crc kubenswrapper[4733]: I0318 10:36:02.564875 4733 generic.go:334] "Generic (PLEG): container finished" podID="d31d006c-81a6-4bbb-a44a-fda966944372" containerID="4d1446897edd6664fd044255842a94c2933d1bab0fe0d09f54123f9a53833063" exitCode=0 Mar 18 10:36:02 crc kubenswrapper[4733]: I0318 10:36:02.564974 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-4x58h" event={"ID":"d31d006c-81a6-4bbb-a44a-fda966944372","Type":"ContainerDied","Data":"4d1446897edd6664fd044255842a94c2933d1bab0fe0d09f54123f9a53833063"} Mar 18 10:36:03 crc kubenswrapper[4733]: I0318 10:36:03.177304 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:36:03 crc kubenswrapper[4733]: E0318 10:36:03.177626 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:36:03 crc kubenswrapper[4733]: I0318 10:36:03.911078 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-4x58h" Mar 18 10:36:04 crc kubenswrapper[4733]: I0318 10:36:04.029757 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c46cx\" (UniqueName: \"kubernetes.io/projected/d31d006c-81a6-4bbb-a44a-fda966944372-kube-api-access-c46cx\") pod \"d31d006c-81a6-4bbb-a44a-fda966944372\" (UID: \"d31d006c-81a6-4bbb-a44a-fda966944372\") " Mar 18 10:36:04 crc kubenswrapper[4733]: I0318 10:36:04.037011 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31d006c-81a6-4bbb-a44a-fda966944372-kube-api-access-c46cx" (OuterVolumeSpecName: "kube-api-access-c46cx") pod "d31d006c-81a6-4bbb-a44a-fda966944372" (UID: "d31d006c-81a6-4bbb-a44a-fda966944372"). InnerVolumeSpecName "kube-api-access-c46cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:36:04 crc kubenswrapper[4733]: I0318 10:36:04.131273 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c46cx\" (UniqueName: \"kubernetes.io/projected/d31d006c-81a6-4bbb-a44a-fda966944372-kube-api-access-c46cx\") on node \"crc\" DevicePath \"\"" Mar 18 10:36:04 crc kubenswrapper[4733]: I0318 10:36:04.581954 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563836-4x58h" event={"ID":"d31d006c-81a6-4bbb-a44a-fda966944372","Type":"ContainerDied","Data":"450d93d363176e50bafab37a68c6e96830ec396aca9eb8497b1b034d1c6120d3"} Mar 18 10:36:04 crc kubenswrapper[4733]: I0318 10:36:04.581998 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="450d93d363176e50bafab37a68c6e96830ec396aca9eb8497b1b034d1c6120d3" Mar 18 10:36:04 crc kubenswrapper[4733]: I0318 10:36:04.582050 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563836-4x58h" Mar 18 10:36:04 crc kubenswrapper[4733]: I0318 10:36:04.991521 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-2qqd2"] Mar 18 10:36:05 crc kubenswrapper[4733]: I0318 10:36:05.002392 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563830-2qqd2"] Mar 18 10:36:05 crc kubenswrapper[4733]: I0318 10:36:05.200295 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c8eb139-c576-4ab3-8c2b-a309f3aa4a35" path="/var/lib/kubelet/pods/3c8eb139-c576-4ab3-8c2b-a309f3aa4a35/volumes" Mar 18 10:36:08 crc kubenswrapper[4733]: I0318 10:36:08.807150 4733 scope.go:117] "RemoveContainer" containerID="be977da7d932bb787db9cafb1727d3bd50b5e03495d1e8a82c232ed7c66e241e" Mar 18 10:36:13 crc kubenswrapper[4733]: I0318 10:36:13.176735 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:36:13 crc kubenswrapper[4733]: E0318 10:36:13.177785 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:36:13 crc kubenswrapper[4733]: I0318 10:36:13.571722 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:36:13 crc kubenswrapper[4733]: I0318 10:36:13.571831 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" 
podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:36:14 crc kubenswrapper[4733]: I0318 10:36:14.176059 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:36:14 crc kubenswrapper[4733]: E0318 10:36:14.176893 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:36:25 crc kubenswrapper[4733]: I0318 10:36:25.176921 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:36:25 crc kubenswrapper[4733]: E0318 10:36:25.189522 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:36:27 crc kubenswrapper[4733]: I0318 10:36:27.175941 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:36:27 crc kubenswrapper[4733]: E0318 10:36:27.176636 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:36:37 crc 
kubenswrapper[4733]: I0318 10:36:37.176043 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:36:37 crc kubenswrapper[4733]: E0318 10:36:37.178739 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:36:41 crc kubenswrapper[4733]: I0318 10:36:41.176297 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:36:41 crc kubenswrapper[4733]: E0318 10:36:41.177743 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:36:43 crc kubenswrapper[4733]: I0318 10:36:43.571855 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:36:43 crc kubenswrapper[4733]: I0318 10:36:43.572337 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:36:43 crc kubenswrapper[4733]: I0318 10:36:43.572412 4733 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:36:43 crc kubenswrapper[4733]: I0318 10:36:43.573400 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18491327409d036c07217a5bf65332367e43c6f94559e59f3995caefe0f899d9"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:36:43 crc kubenswrapper[4733]: I0318 10:36:43.573500 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://18491327409d036c07217a5bf65332367e43c6f94559e59f3995caefe0f899d9" gracePeriod=600 Mar 18 10:36:44 crc kubenswrapper[4733]: I0318 10:36:44.021808 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="18491327409d036c07217a5bf65332367e43c6f94559e59f3995caefe0f899d9" exitCode=0 Mar 18 10:36:44 crc kubenswrapper[4733]: I0318 10:36:44.022025 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"18491327409d036c07217a5bf65332367e43c6f94559e59f3995caefe0f899d9"} Mar 18 10:36:44 crc kubenswrapper[4733]: I0318 10:36:44.022219 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"} Mar 18 10:36:44 crc kubenswrapper[4733]: I0318 10:36:44.022276 4733 scope.go:117] "RemoveContainer" 
containerID="345f1c51e0b2f38e27fd31ce4a7323d51ffa4b8419f456177dd8653558afb625" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.085860 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g9kvf"] Mar 18 10:36:50 crc kubenswrapper[4733]: E0318 10:36:50.088068 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31d006c-81a6-4bbb-a44a-fda966944372" containerName="oc" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.088101 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31d006c-81a6-4bbb-a44a-fda966944372" containerName="oc" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.089037 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31d006c-81a6-4bbb-a44a-fda966944372" containerName="oc" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.102608 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.123082 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g9kvf"] Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.202473 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-catalog-content\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.202886 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-utilities\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " pod="openshift-marketplace/community-operators-g9kvf" Mar 
18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.202996 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46xhq\" (UniqueName: \"kubernetes.io/projected/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-kube-api-access-46xhq\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.305009 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46xhq\" (UniqueName: \"kubernetes.io/projected/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-kube-api-access-46xhq\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.307160 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-catalog-content\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.307845 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-catalog-content\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.308665 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-utilities\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " 
pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.309363 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-utilities\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.332504 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46xhq\" (UniqueName: \"kubernetes.io/projected/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-kube-api-access-46xhq\") pod \"community-operators-g9kvf\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.447943 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:36:50 crc kubenswrapper[4733]: I0318 10:36:50.960714 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g9kvf"] Mar 18 10:36:51 crc kubenswrapper[4733]: I0318 10:36:51.125675 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9kvf" event={"ID":"a2b2cea8-3250-49e7-9afb-8cd8adfbc175","Type":"ContainerStarted","Data":"77b5823cb17c9bd5ad620ae110d159272c1807518ae06bbaa2150540125e3326"} Mar 18 10:36:51 crc kubenswrapper[4733]: I0318 10:36:51.189157 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:36:51 crc kubenswrapper[4733]: E0318 10:36:51.189830 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq 
pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:36:52 crc kubenswrapper[4733]: I0318 10:36:52.138710 4733 generic.go:334] "Generic (PLEG): container finished" podID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerID="c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867" exitCode=0 Mar 18 10:36:52 crc kubenswrapper[4733]: I0318 10:36:52.138785 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9kvf" event={"ID":"a2b2cea8-3250-49e7-9afb-8cd8adfbc175","Type":"ContainerDied","Data":"c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867"} Mar 18 10:36:52 crc kubenswrapper[4733]: I0318 10:36:52.177005 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:36:52 crc kubenswrapper[4733]: E0318 10:36:52.177430 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.152227 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9kvf" event={"ID":"a2b2cea8-3250-49e7-9afb-8cd8adfbc175","Type":"ContainerStarted","Data":"0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37"} Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.674853 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mzcst"] Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.682956 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.688427 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzcst"] Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.773133 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mspkt\" (UniqueName: \"kubernetes.io/projected/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-kube-api-access-mspkt\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.773409 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-catalog-content\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.773606 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-utilities\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.874883 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mspkt\" (UniqueName: \"kubernetes.io/projected/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-kube-api-access-mspkt\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.875077 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-catalog-content\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.875779 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-catalog-content\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.875939 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-utilities\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.876436 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-utilities\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:53 crc kubenswrapper[4733]: I0318 10:36:53.902112 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mspkt\" (UniqueName: \"kubernetes.io/projected/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-kube-api-access-mspkt\") pod \"redhat-marketplace-mzcst\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:54 crc kubenswrapper[4733]: I0318 10:36:54.010764 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:36:54 crc kubenswrapper[4733]: I0318 10:36:54.178490 4733 generic.go:334] "Generic (PLEG): container finished" podID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerID="0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37" exitCode=0 Mar 18 10:36:54 crc kubenswrapper[4733]: I0318 10:36:54.178548 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9kvf" event={"ID":"a2b2cea8-3250-49e7-9afb-8cd8adfbc175","Type":"ContainerDied","Data":"0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37"} Mar 18 10:36:54 crc kubenswrapper[4733]: I0318 10:36:54.509670 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzcst"] Mar 18 10:36:55 crc kubenswrapper[4733]: I0318 10:36:55.197933 4733 generic.go:334] "Generic (PLEG): container finished" podID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerID="43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b" exitCode=0 Mar 18 10:36:55 crc kubenswrapper[4733]: I0318 10:36:55.198038 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzcst" event={"ID":"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9","Type":"ContainerDied","Data":"43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b"} Mar 18 10:36:55 crc kubenswrapper[4733]: I0318 10:36:55.198377 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzcst" event={"ID":"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9","Type":"ContainerStarted","Data":"733fa8b795ed27071ff92b08dafee2478465f1de0391a9876c4ef3a903350321"} Mar 18 10:36:55 crc kubenswrapper[4733]: I0318 10:36:55.201463 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9kvf" 
event={"ID":"a2b2cea8-3250-49e7-9afb-8cd8adfbc175","Type":"ContainerStarted","Data":"137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0"} Mar 18 10:36:55 crc kubenswrapper[4733]: I0318 10:36:55.245046 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g9kvf" podStartSLOduration=2.731001655 podStartE2EDuration="5.245021351s" podCreationTimestamp="2026-03-18 10:36:50 +0000 UTC" firstStartedPulling="2026-03-18 10:36:52.140860408 +0000 UTC m=+1451.632594763" lastFinishedPulling="2026-03-18 10:36:54.654880134 +0000 UTC m=+1454.146614459" observedRunningTime="2026-03-18 10:36:55.235597225 +0000 UTC m=+1454.727331550" watchObservedRunningTime="2026-03-18 10:36:55.245021351 +0000 UTC m=+1454.736755696" Mar 18 10:36:56 crc kubenswrapper[4733]: I0318 10:36:56.210930 4733 generic.go:334] "Generic (PLEG): container finished" podID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerID="1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e" exitCode=0 Mar 18 10:36:56 crc kubenswrapper[4733]: I0318 10:36:56.211029 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzcst" event={"ID":"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9","Type":"ContainerDied","Data":"1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e"} Mar 18 10:36:57 crc kubenswrapper[4733]: I0318 10:36:57.236586 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzcst" event={"ID":"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9","Type":"ContainerStarted","Data":"e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7"} Mar 18 10:36:57 crc kubenswrapper[4733]: I0318 10:36:57.269010 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mzcst" podStartSLOduration=2.884084295 podStartE2EDuration="4.26899436s" podCreationTimestamp="2026-03-18 10:36:53 +0000 UTC" 
firstStartedPulling="2026-03-18 10:36:55.201060743 +0000 UTC m=+1454.692795068" lastFinishedPulling="2026-03-18 10:36:56.585970778 +0000 UTC m=+1456.077705133" observedRunningTime="2026-03-18 10:36:57.263614658 +0000 UTC m=+1456.755348983" watchObservedRunningTime="2026-03-18 10:36:57.26899436 +0000 UTC m=+1456.760728675" Mar 18 10:37:00 crc kubenswrapper[4733]: I0318 10:37:00.449574 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:37:00 crc kubenswrapper[4733]: I0318 10:37:00.450868 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:37:00 crc kubenswrapper[4733]: I0318 10:37:00.527543 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:37:01 crc kubenswrapper[4733]: I0318 10:37:01.310602 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:37:02 crc kubenswrapper[4733]: I0318 10:37:02.859980 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g9kvf"] Mar 18 10:37:03 crc kubenswrapper[4733]: I0318 10:37:03.176095 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:37:03 crc kubenswrapper[4733]: E0318 10:37:03.176629 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.011527 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.011605 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.069000 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.295847 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g9kvf" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerName="registry-server" containerID="cri-o://137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0" gracePeriod=2 Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.371728 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.810769 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.960811 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-utilities\") pod \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.961010 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46xhq\" (UniqueName: \"kubernetes.io/projected/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-kube-api-access-46xhq\") pod \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.961036 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-catalog-content\") pod \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\" (UID: \"a2b2cea8-3250-49e7-9afb-8cd8adfbc175\") " Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.962083 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-utilities" (OuterVolumeSpecName: "utilities") pod "a2b2cea8-3250-49e7-9afb-8cd8adfbc175" (UID: "a2b2cea8-3250-49e7-9afb-8cd8adfbc175"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:37:04 crc kubenswrapper[4733]: I0318 10:37:04.971152 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-kube-api-access-46xhq" (OuterVolumeSpecName: "kube-api-access-46xhq") pod "a2b2cea8-3250-49e7-9afb-8cd8adfbc175" (UID: "a2b2cea8-3250-49e7-9afb-8cd8adfbc175"). InnerVolumeSpecName "kube-api-access-46xhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.059434 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2b2cea8-3250-49e7-9afb-8cd8adfbc175" (UID: "a2b2cea8-3250-49e7-9afb-8cd8adfbc175"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.063635 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.063699 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46xhq\" (UniqueName: \"kubernetes.io/projected/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-kube-api-access-46xhq\") on node \"crc\" DevicePath \"\"" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.063721 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2b2cea8-3250-49e7-9afb-8cd8adfbc175-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.175589 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:37:05 crc kubenswrapper[4733]: E0318 10:37:05.176081 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.312096 4733 generic.go:334] "Generic 
(PLEG): container finished" podID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerID="137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0" exitCode=0 Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.312242 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g9kvf" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.312236 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9kvf" event={"ID":"a2b2cea8-3250-49e7-9afb-8cd8adfbc175","Type":"ContainerDied","Data":"137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0"} Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.312406 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9kvf" event={"ID":"a2b2cea8-3250-49e7-9afb-8cd8adfbc175","Type":"ContainerDied","Data":"77b5823cb17c9bd5ad620ae110d159272c1807518ae06bbaa2150540125e3326"} Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.312451 4733 scope.go:117] "RemoveContainer" containerID="137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.356464 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g9kvf"] Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.362089 4733 scope.go:117] "RemoveContainer" containerID="0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.363845 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g9kvf"] Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.393149 4733 scope.go:117] "RemoveContainer" containerID="c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.434562 4733 scope.go:117] "RemoveContainer" 
containerID="137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0" Mar 18 10:37:05 crc kubenswrapper[4733]: E0318 10:37:05.435323 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0\": container with ID starting with 137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0 not found: ID does not exist" containerID="137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.435366 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0"} err="failed to get container status \"137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0\": rpc error: code = NotFound desc = could not find container \"137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0\": container with ID starting with 137f4b5cd6409671241846735f9faafc7b02148bf4e909270882de413e75c2d0 not found: ID does not exist" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.435392 4733 scope.go:117] "RemoveContainer" containerID="0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37" Mar 18 10:37:05 crc kubenswrapper[4733]: E0318 10:37:05.435935 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37\": container with ID starting with 0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37 not found: ID does not exist" containerID="0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.436005 4733 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37"} err="failed to get container status \"0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37\": rpc error: code = NotFound desc = could not find container \"0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37\": container with ID starting with 0182ab6d575fb1256d6047471250a820525e208bd74c344e9e4505523d32cf37 not found: ID does not exist" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.436053 4733 scope.go:117] "RemoveContainer" containerID="c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867" Mar 18 10:37:05 crc kubenswrapper[4733]: E0318 10:37:05.438447 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867\": container with ID starting with c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867 not found: ID does not exist" containerID="c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867" Mar 18 10:37:05 crc kubenswrapper[4733]: I0318 10:37:05.438527 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867"} err="failed to get container status \"c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867\": rpc error: code = NotFound desc = could not find container \"c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867\": container with ID starting with c5a21fa0cbceaea43ca29ac1a7ba14c49e0499e62c04076b20c4a0dea541c867 not found: ID does not exist" Mar 18 10:37:06 crc kubenswrapper[4733]: I0318 10:37:06.460661 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzcst"] Mar 18 10:37:06 crc kubenswrapper[4733]: I0318 10:37:06.461312 4733 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-mzcst" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerName="registry-server" containerID="cri-o://e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7" gracePeriod=2 Mar 18 10:37:06 crc kubenswrapper[4733]: I0318 10:37:06.973940 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.102092 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mspkt\" (UniqueName: \"kubernetes.io/projected/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-kube-api-access-mspkt\") pod \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.102171 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-utilities\") pod \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.102365 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-catalog-content\") pod \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\" (UID: \"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9\") " Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.103958 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-utilities" (OuterVolumeSpecName: "utilities") pod "6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" (UID: "6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.112675 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-kube-api-access-mspkt" (OuterVolumeSpecName: "kube-api-access-mspkt") pod "6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" (UID: "6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9"). InnerVolumeSpecName "kube-api-access-mspkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.155265 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" (UID: "6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.190704 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" path="/var/lib/kubelet/pods/a2b2cea8-3250-49e7-9afb-8cd8adfbc175/volumes" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.204595 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.204673 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mspkt\" (UniqueName: \"kubernetes.io/projected/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-kube-api-access-mspkt\") on node \"crc\" DevicePath \"\"" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.204702 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.344083 4733 generic.go:334] "Generic (PLEG): container finished" podID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerID="e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7" exitCode=0 Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.344143 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzcst" event={"ID":"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9","Type":"ContainerDied","Data":"e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7"} Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.344182 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzcst" event={"ID":"6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9","Type":"ContainerDied","Data":"733fa8b795ed27071ff92b08dafee2478465f1de0391a9876c4ef3a903350321"} Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.344242 4733 scope.go:117] "RemoveContainer" containerID="e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.344430 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzcst" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.385502 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzcst"] Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.387054 4733 scope.go:117] "RemoveContainer" containerID="1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.394136 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzcst"] Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.416554 4733 scope.go:117] "RemoveContainer" containerID="43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.457496 4733 scope.go:117] "RemoveContainer" containerID="e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7" Mar 18 10:37:07 crc kubenswrapper[4733]: E0318 10:37:07.458025 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7\": container with ID starting with e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7 not found: ID does not exist" containerID="e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.458085 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7"} err="failed to get container status \"e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7\": rpc error: code = NotFound desc = could not find container \"e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7\": container with ID starting with e01b9cb5a9a576b6af51a3c336323b76b216d79f06485e7f64ace3d275aedff7 not found: 
ID does not exist" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.458117 4733 scope.go:117] "RemoveContainer" containerID="1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e" Mar 18 10:37:07 crc kubenswrapper[4733]: E0318 10:37:07.458655 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e\": container with ID starting with 1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e not found: ID does not exist" containerID="1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.458711 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e"} err="failed to get container status \"1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e\": rpc error: code = NotFound desc = could not find container \"1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e\": container with ID starting with 1d3c68b39fae2dbff6428ba8437ca1b3885a84632766d99dab081c76cfafac7e not found: ID does not exist" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.458750 4733 scope.go:117] "RemoveContainer" containerID="43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b" Mar 18 10:37:07 crc kubenswrapper[4733]: E0318 10:37:07.459299 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b\": container with ID starting with 43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b not found: ID does not exist" containerID="43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b" Mar 18 10:37:07 crc kubenswrapper[4733]: I0318 10:37:07.459359 4733 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b"} err="failed to get container status \"43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b\": rpc error: code = NotFound desc = could not find container \"43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b\": container with ID starting with 43bc7fdfbeffa77f19586c47b1204bb99968bbe1c8c66ec263c804c270f6200b not found: ID does not exist" Mar 18 10:37:09 crc kubenswrapper[4733]: I0318 10:37:09.192184 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" path="/var/lib/kubelet/pods/6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9/volumes" Mar 18 10:37:17 crc kubenswrapper[4733]: I0318 10:37:17.174973 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:37:17 crc kubenswrapper[4733]: E0318 10:37:17.175705 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:37:17 crc kubenswrapper[4733]: I0318 10:37:17.175850 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:37:17 crc kubenswrapper[4733]: E0318 10:37:17.176128 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:37:28 crc kubenswrapper[4733]: I0318 10:37:28.175859 4733 scope.go:117] 
"RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:37:28 crc kubenswrapper[4733]: E0318 10:37:28.177157 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:37:29 crc kubenswrapper[4733]: I0318 10:37:29.176436 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:37:29 crc kubenswrapper[4733]: E0318 10:37:29.176915 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:37:44 crc kubenswrapper[4733]: I0318 10:37:44.175663 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:37:44 crc kubenswrapper[4733]: I0318 10:37:44.176343 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:37:44 crc kubenswrapper[4733]: E0318 10:37:44.176687 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:37:44 crc kubenswrapper[4733]: E0318 10:37:44.176759 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.566180 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wxvvt"] Mar 18 10:37:54 crc kubenswrapper[4733]: E0318 10:37:54.567609 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerName="extract-utilities" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.567647 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerName="extract-utilities" Mar 18 10:37:54 crc kubenswrapper[4733]: E0318 10:37:54.567693 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerName="extract-content" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.567712 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerName="extract-content" Mar 18 10:37:54 crc kubenswrapper[4733]: E0318 10:37:54.567766 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerName="extract-content" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.567785 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerName="extract-content" Mar 18 10:37:54 crc kubenswrapper[4733]: E0318 10:37:54.567818 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerName="registry-server" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.567836 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" 
containerName="registry-server" Mar 18 10:37:54 crc kubenswrapper[4733]: E0318 10:37:54.567861 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerName="registry-server" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.567877 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerName="registry-server" Mar 18 10:37:54 crc kubenswrapper[4733]: E0318 10:37:54.567905 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerName="extract-utilities" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.567923 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerName="extract-utilities" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.570177 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9e3c2c-c4de-4719-88bb-6392a6f7f0e9" containerName="registry-server" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.570308 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b2cea8-3250-49e7-9afb-8cd8adfbc175" containerName="registry-server" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.572753 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.585216 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxvvt"] Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.730709 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-catalog-content\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.731213 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjpzr\" (UniqueName: \"kubernetes.io/projected/b0c43d05-915d-4373-aae0-67a182acc4bc-kube-api-access-xjpzr\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.731302 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-utilities\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.833933 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjpzr\" (UniqueName: \"kubernetes.io/projected/b0c43d05-915d-4373-aae0-67a182acc4bc-kube-api-access-xjpzr\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.834000 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-utilities\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.834129 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-catalog-content\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.835045 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-catalog-content\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.835151 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-utilities\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.861123 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjpzr\" (UniqueName: \"kubernetes.io/projected/b0c43d05-915d-4373-aae0-67a182acc4bc-kube-api-access-xjpzr\") pod \"redhat-operators-wxvvt\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:54 crc kubenswrapper[4733]: I0318 10:37:54.949440 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:37:55 crc kubenswrapper[4733]: I0318 10:37:55.398140 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxvvt"] Mar 18 10:37:55 crc kubenswrapper[4733]: I0318 10:37:55.848732 4733 generic.go:334] "Generic (PLEG): container finished" podID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerID="7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e" exitCode=0 Mar 18 10:37:55 crc kubenswrapper[4733]: I0318 10:37:55.848801 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvvt" event={"ID":"b0c43d05-915d-4373-aae0-67a182acc4bc","Type":"ContainerDied","Data":"7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e"} Mar 18 10:37:55 crc kubenswrapper[4733]: I0318 10:37:55.848840 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvvt" event={"ID":"b0c43d05-915d-4373-aae0-67a182acc4bc","Type":"ContainerStarted","Data":"98f9eb05565700e8349c0932c41e77ad80e45cc728fe839b9e5a63014e0f9e85"} Mar 18 10:37:55 crc kubenswrapper[4733]: I0318 10:37:55.852149 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:37:57 crc kubenswrapper[4733]: I0318 10:37:57.176419 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:37:57 crc kubenswrapper[4733]: E0318 10:37:57.177795 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:37:57 crc kubenswrapper[4733]: I0318 10:37:57.882227 4733 generic.go:334] "Generic (PLEG): 
container finished" podID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerID="c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74" exitCode=0 Mar 18 10:37:57 crc kubenswrapper[4733]: I0318 10:37:57.882268 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvvt" event={"ID":"b0c43d05-915d-4373-aae0-67a182acc4bc","Type":"ContainerDied","Data":"c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74"} Mar 18 10:37:58 crc kubenswrapper[4733]: I0318 10:37:58.175986 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:37:58 crc kubenswrapper[4733]: I0318 10:37:58.892058 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a"} Mar 18 10:37:58 crc kubenswrapper[4733]: I0318 10:37:58.893146 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 10:37:58 crc kubenswrapper[4733]: I0318 10:37:58.894517 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvvt" event={"ID":"b0c43d05-915d-4373-aae0-67a182acc4bc","Type":"ContainerStarted","Data":"a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa"} Mar 18 10:37:58 crc kubenswrapper[4733]: I0318 10:37:58.933466 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wxvvt" podStartSLOduration=2.286969509 podStartE2EDuration="4.933435025s" podCreationTimestamp="2026-03-18 10:37:54 +0000 UTC" firstStartedPulling="2026-03-18 10:37:55.85194359 +0000 UTC m=+1515.343677915" lastFinishedPulling="2026-03-18 10:37:58.498409076 +0000 UTC m=+1517.990143431" observedRunningTime="2026-03-18 10:37:58.930374029 +0000 UTC m=+1518.422108354" 
watchObservedRunningTime="2026-03-18 10:37:58.933435025 +0000 UTC m=+1518.425169350" Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.150338 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563838-6fbsd"] Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.151528 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-6fbsd" Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.154470 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.154602 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.169256 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-6fbsd"] Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.170134 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.231387 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcj82\" (UniqueName: \"kubernetes.io/projected/3959ab36-a688-40ac-b70b-b3cc35b1d7a1-kube-api-access-tcj82\") pod \"auto-csr-approver-29563838-6fbsd\" (UID: \"3959ab36-a688-40ac-b70b-b3cc35b1d7a1\") " pod="openshift-infra/auto-csr-approver-29563838-6fbsd" Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.333111 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcj82\" (UniqueName: \"kubernetes.io/projected/3959ab36-a688-40ac-b70b-b3cc35b1d7a1-kube-api-access-tcj82\") pod \"auto-csr-approver-29563838-6fbsd\" (UID: \"3959ab36-a688-40ac-b70b-b3cc35b1d7a1\") " 
pod="openshift-infra/auto-csr-approver-29563838-6fbsd" Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.371527 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcj82\" (UniqueName: \"kubernetes.io/projected/3959ab36-a688-40ac-b70b-b3cc35b1d7a1-kube-api-access-tcj82\") pod \"auto-csr-approver-29563838-6fbsd\" (UID: \"3959ab36-a688-40ac-b70b-b3cc35b1d7a1\") " pod="openshift-infra/auto-csr-approver-29563838-6fbsd" Mar 18 10:38:00 crc kubenswrapper[4733]: I0318 10:38:00.486446 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-6fbsd" Mar 18 10:38:00 crc kubenswrapper[4733]: W0318 10:38:00.993280 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3959ab36_a688_40ac_b70b_b3cc35b1d7a1.slice/crio-767ef7bf1d3ba1aa0c4f7321a9fd4a583591dd67c6f6f39f86b5ad15f4ac2b8d WatchSource:0}: Error finding container 767ef7bf1d3ba1aa0c4f7321a9fd4a583591dd67c6f6f39f86b5ad15f4ac2b8d: Status 404 returned error can't find the container with id 767ef7bf1d3ba1aa0c4f7321a9fd4a583591dd67c6f6f39f86b5ad15f4ac2b8d Mar 18 10:38:01 crc kubenswrapper[4733]: I0318 10:38:01.006609 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-6fbsd"] Mar 18 10:38:01 crc kubenswrapper[4733]: I0318 10:38:01.922971 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-6fbsd" event={"ID":"3959ab36-a688-40ac-b70b-b3cc35b1d7a1","Type":"ContainerStarted","Data":"767ef7bf1d3ba1aa0c4f7321a9fd4a583591dd67c6f6f39f86b5ad15f4ac2b8d"} Mar 18 10:38:02 crc kubenswrapper[4733]: I0318 10:38:02.941285 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" exitCode=0 Mar 18 10:38:02 crc kubenswrapper[4733]: 
I0318 10:38:02.941496 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a"} Mar 18 10:38:02 crc kubenswrapper[4733]: I0318 10:38:02.941874 4733 scope.go:117] "RemoveContainer" containerID="245acab52e36967117888a03ca9615fc134fa986b5326378b27571bcb153bf6a" Mar 18 10:38:02 crc kubenswrapper[4733]: I0318 10:38:02.942954 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:38:02 crc kubenswrapper[4733]: E0318 10:38:02.943455 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:38:02 crc kubenswrapper[4733]: I0318 10:38:02.947710 4733 generic.go:334] "Generic (PLEG): container finished" podID="3959ab36-a688-40ac-b70b-b3cc35b1d7a1" containerID="ac18342a3539a4f4eb0b18430ab3c33bd2af4e21dfc3695dc34573a145ad949d" exitCode=0 Mar 18 10:38:02 crc kubenswrapper[4733]: I0318 10:38:02.947783 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-6fbsd" event={"ID":"3959ab36-a688-40ac-b70b-b3cc35b1d7a1","Type":"ContainerDied","Data":"ac18342a3539a4f4eb0b18430ab3c33bd2af4e21dfc3695dc34573a145ad949d"} Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.299127 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-6fbsd" Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.423334 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcj82\" (UniqueName: \"kubernetes.io/projected/3959ab36-a688-40ac-b70b-b3cc35b1d7a1-kube-api-access-tcj82\") pod \"3959ab36-a688-40ac-b70b-b3cc35b1d7a1\" (UID: \"3959ab36-a688-40ac-b70b-b3cc35b1d7a1\") " Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.433750 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3959ab36-a688-40ac-b70b-b3cc35b1d7a1-kube-api-access-tcj82" (OuterVolumeSpecName: "kube-api-access-tcj82") pod "3959ab36-a688-40ac-b70b-b3cc35b1d7a1" (UID: "3959ab36-a688-40ac-b70b-b3cc35b1d7a1"). InnerVolumeSpecName "kube-api-access-tcj82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.525524 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcj82\" (UniqueName: \"kubernetes.io/projected/3959ab36-a688-40ac-b70b-b3cc35b1d7a1-kube-api-access-tcj82\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.949817 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.950048 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.975148 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563838-6fbsd" Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.985386 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563838-6fbsd" event={"ID":"3959ab36-a688-40ac-b70b-b3cc35b1d7a1","Type":"ContainerDied","Data":"767ef7bf1d3ba1aa0c4f7321a9fd4a583591dd67c6f6f39f86b5ad15f4ac2b8d"} Mar 18 10:38:04 crc kubenswrapper[4733]: I0318 10:38:04.985461 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767ef7bf1d3ba1aa0c4f7321a9fd4a583591dd67c6f6f39f86b5ad15f4ac2b8d" Mar 18 10:38:05 crc kubenswrapper[4733]: I0318 10:38:05.398400 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-njktw"] Mar 18 10:38:05 crc kubenswrapper[4733]: I0318 10:38:05.408699 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563832-njktw"] Mar 18 10:38:06 crc kubenswrapper[4733]: I0318 10:38:06.010021 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wxvvt" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="registry-server" probeResult="failure" output=< Mar 18 10:38:06 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Mar 18 10:38:06 crc kubenswrapper[4733]: > Mar 18 10:38:07 crc kubenswrapper[4733]: I0318 10:38:07.189513 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac26b4cb-ac0b-4b78-9c5e-60c6563b478e" path="/var/lib/kubelet/pods/ac26b4cb-ac0b-4b78-9c5e-60c6563b478e/volumes" Mar 18 10:38:08 crc kubenswrapper[4733]: I0318 10:38:08.985972 4733 scope.go:117] "RemoveContainer" containerID="9bc7f39c4918c4a53f61ec2045418343aab6acedb5d7104271be60607764a8a9" Mar 18 10:38:09 crc kubenswrapper[4733]: I0318 10:38:09.013307 4733 scope.go:117] "RemoveContainer" containerID="fc8a98034f827fb8988cc2fa281e7a7c5e2bd32e772267e324591ed784c75b62" Mar 18 
10:38:09 crc kubenswrapper[4733]: I0318 10:38:09.075507 4733 scope.go:117] "RemoveContainer" containerID="6a586c5fd4b77aeae152ada8c17b0c5946fd162e927825804e11d81559ba17f0" Mar 18 10:38:10 crc kubenswrapper[4733]: I0318 10:38:10.176415 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:38:11 crc kubenswrapper[4733]: I0318 10:38:11.033430 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c"} Mar 18 10:38:11 crc kubenswrapper[4733]: I0318 10:38:11.034334 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:38:15 crc kubenswrapper[4733]: I0318 10:38:15.026113 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:38:15 crc kubenswrapper[4733]: I0318 10:38:15.079306 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" exitCode=0 Mar 18 10:38:15 crc kubenswrapper[4733]: I0318 10:38:15.079385 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c"} Mar 18 10:38:15 crc kubenswrapper[4733]: I0318 10:38:15.079457 4733 scope.go:117] "RemoveContainer" containerID="b8c4d43890082484b9a1254cd9426c5cd83f45e4b1a61544da192f3cdebac71e" Mar 18 10:38:15 crc kubenswrapper[4733]: I0318 10:38:15.080607 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:38:15 crc kubenswrapper[4733]: E0318 
10:38:15.081108 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:38:15 crc kubenswrapper[4733]: I0318 10:38:15.115472 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:38:15 crc kubenswrapper[4733]: I0318 10:38:15.287038 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxvvt"] Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.093397 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wxvvt" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="registry-server" containerID="cri-o://a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa" gracePeriod=2 Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.176075 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:38:16 crc kubenswrapper[4733]: E0318 10:38:16.176466 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.660619 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.751956 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-utilities\") pod \"b0c43d05-915d-4373-aae0-67a182acc4bc\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.752119 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-catalog-content\") pod \"b0c43d05-915d-4373-aae0-67a182acc4bc\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.752215 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjpzr\" (UniqueName: \"kubernetes.io/projected/b0c43d05-915d-4373-aae0-67a182acc4bc-kube-api-access-xjpzr\") pod \"b0c43d05-915d-4373-aae0-67a182acc4bc\" (UID: \"b0c43d05-915d-4373-aae0-67a182acc4bc\") " Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.754146 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-utilities" (OuterVolumeSpecName: "utilities") pod "b0c43d05-915d-4373-aae0-67a182acc4bc" (UID: "b0c43d05-915d-4373-aae0-67a182acc4bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.767658 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c43d05-915d-4373-aae0-67a182acc4bc-kube-api-access-xjpzr" (OuterVolumeSpecName: "kube-api-access-xjpzr") pod "b0c43d05-915d-4373-aae0-67a182acc4bc" (UID: "b0c43d05-915d-4373-aae0-67a182acc4bc"). InnerVolumeSpecName "kube-api-access-xjpzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.853867 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.853901 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjpzr\" (UniqueName: \"kubernetes.io/projected/b0c43d05-915d-4373-aae0-67a182acc4bc-kube-api-access-xjpzr\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.938745 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0c43d05-915d-4373-aae0-67a182acc4bc" (UID: "b0c43d05-915d-4373-aae0-67a182acc4bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:38:16 crc kubenswrapper[4733]: I0318 10:38:16.955709 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c43d05-915d-4373-aae0-67a182acc4bc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.126913 4733 generic.go:334] "Generic (PLEG): container finished" podID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerID="a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa" exitCode=0 Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.126974 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvvt" event={"ID":"b0c43d05-915d-4373-aae0-67a182acc4bc","Type":"ContainerDied","Data":"a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa"} Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.127008 4733 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-wxvvt" event={"ID":"b0c43d05-915d-4373-aae0-67a182acc4bc","Type":"ContainerDied","Data":"98f9eb05565700e8349c0932c41e77ad80e45cc728fe839b9e5a63014e0f9e85"} Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.127031 4733 scope.go:117] "RemoveContainer" containerID="a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.127073 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxvvt" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.161141 4733 scope.go:117] "RemoveContainer" containerID="c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.199551 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxvvt"] Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.200338 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wxvvt"] Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.207389 4733 scope.go:117] "RemoveContainer" containerID="7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.237794 4733 scope.go:117] "RemoveContainer" containerID="a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa" Mar 18 10:38:17 crc kubenswrapper[4733]: E0318 10:38:17.238998 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa\": container with ID starting with a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa not found: ID does not exist" containerID="a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.239069 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa"} err="failed to get container status \"a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa\": rpc error: code = NotFound desc = could not find container \"a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa\": container with ID starting with a15d60390589ca2e549d7e30fc944c0099e4336af37991944c6539ec52e623aa not found: ID does not exist" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.239139 4733 scope.go:117] "RemoveContainer" containerID="c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74" Mar 18 10:38:17 crc kubenswrapper[4733]: E0318 10:38:17.240164 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74\": container with ID starting with c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74 not found: ID does not exist" containerID="c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.240227 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74"} err="failed to get container status \"c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74\": rpc error: code = NotFound desc = could not find container \"c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74\": container with ID starting with c49023eae46560082477167844c01c17c714b041e28ae5e5bfa596612d1eda74 not found: ID does not exist" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.240257 4733 scope.go:117] "RemoveContainer" containerID="7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e" Mar 18 10:38:17 crc kubenswrapper[4733]: E0318 
10:38:17.240873 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e\": container with ID starting with 7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e not found: ID does not exist" containerID="7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e" Mar 18 10:38:17 crc kubenswrapper[4733]: I0318 10:38:17.241048 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e"} err="failed to get container status \"7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e\": rpc error: code = NotFound desc = could not find container \"7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e\": container with ID starting with 7261f0adf78b1ca0abdaa2369b3560c3c1bc5bc57577bda27c7f5f634951063e not found: ID does not exist" Mar 18 10:38:19 crc kubenswrapper[4733]: I0318 10:38:19.194583 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" path="/var/lib/kubelet/pods/b0c43d05-915d-4373-aae0-67a182acc4bc/volumes" Mar 18 10:38:30 crc kubenswrapper[4733]: I0318 10:38:30.176750 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:38:30 crc kubenswrapper[4733]: E0318 10:38:30.177882 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:38:30 crc kubenswrapper[4733]: I0318 10:38:30.177890 4733 scope.go:117] "RemoveContainer" 
containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:38:30 crc kubenswrapper[4733]: E0318 10:38:30.178464 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:38:41 crc kubenswrapper[4733]: I0318 10:38:41.185371 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:38:41 crc kubenswrapper[4733]: E0318 10:38:41.186577 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:38:43 crc kubenswrapper[4733]: I0318 10:38:43.571742 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:38:43 crc kubenswrapper[4733]: I0318 10:38:43.572238 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:38:44 crc kubenswrapper[4733]: I0318 10:38:44.176104 4733 scope.go:117] "RemoveContainer" 
containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:38:44 crc kubenswrapper[4733]: E0318 10:38:44.176661 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:38:56 crc kubenswrapper[4733]: I0318 10:38:56.175155 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:38:56 crc kubenswrapper[4733]: E0318 10:38:56.176813 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:38:56 crc kubenswrapper[4733]: I0318 10:38:56.176907 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:38:56 crc kubenswrapper[4733]: E0318 10:38:56.177138 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:39:07 crc kubenswrapper[4733]: I0318 10:39:07.176905 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:39:07 crc kubenswrapper[4733]: E0318 10:39:07.178328 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:39:09 crc kubenswrapper[4733]: I0318 10:39:09.220517 4733 scope.go:117] "RemoveContainer" containerID="da48f5028812280b5314f3d818c71b3049bdb0d8b1d5755bc74f1fedad4676d7" Mar 18 10:39:11 crc kubenswrapper[4733]: I0318 10:39:11.181379 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:39:11 crc kubenswrapper[4733]: E0318 10:39:11.181980 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:39:13 crc kubenswrapper[4733]: I0318 10:39:13.571700 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:39:13 crc kubenswrapper[4733]: I0318 10:39:13.572127 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:39:20 crc kubenswrapper[4733]: I0318 10:39:20.176215 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:39:20 crc kubenswrapper[4733]: E0318 10:39:20.177217 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:39:25 crc kubenswrapper[4733]: I0318 10:39:25.175961 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:39:25 crc kubenswrapper[4733]: E0318 10:39:25.176888 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:39:34 crc kubenswrapper[4733]: I0318 10:39:34.176649 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:39:34 crc kubenswrapper[4733]: E0318 10:39:34.177586 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:39:36 crc kubenswrapper[4733]: I0318 10:39:36.175634 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:39:36 crc kubenswrapper[4733]: E0318 10:39:36.176949 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:39:43 crc kubenswrapper[4733]: I0318 10:39:43.571798 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:39:43 crc kubenswrapper[4733]: I0318 10:39:43.572624 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:39:43 crc kubenswrapper[4733]: I0318 10:39:43.572691 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:39:43 crc kubenswrapper[4733]: I0318 10:39:43.573574 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:39:43 crc kubenswrapper[4733]: I0318 10:39:43.573670 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" gracePeriod=600 Mar 18 10:39:43 crc kubenswrapper[4733]: E0318 
10:39:43.703727 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:39:44 crc kubenswrapper[4733]: I0318 10:39:44.024802 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" exitCode=0 Mar 18 10:39:44 crc kubenswrapper[4733]: I0318 10:39:44.024909 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"} Mar 18 10:39:44 crc kubenswrapper[4733]: I0318 10:39:44.025226 4733 scope.go:117] "RemoveContainer" containerID="18491327409d036c07217a5bf65332367e43c6f94559e59f3995caefe0f899d9" Mar 18 10:39:44 crc kubenswrapper[4733]: I0318 10:39:44.026060 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:39:44 crc kubenswrapper[4733]: E0318 10:39:44.026498 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:39:46 crc kubenswrapper[4733]: I0318 10:39:46.176800 4733 scope.go:117] "RemoveContainer" 
containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:39:46 crc kubenswrapper[4733]: E0318 10:39:46.177682 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:39:47 crc kubenswrapper[4733]: I0318 10:39:47.175864 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:39:47 crc kubenswrapper[4733]: E0318 10:39:47.176380 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:39:57 crc kubenswrapper[4733]: I0318 10:39:57.175984 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:39:57 crc kubenswrapper[4733]: E0318 10:39:57.176983 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.159937 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563840-lvjwz"] Mar 18 10:40:00 crc kubenswrapper[4733]: E0318 10:40:00.160668 4733 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="extract-utilities" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.160681 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="extract-utilities" Mar 18 10:40:00 crc kubenswrapper[4733]: E0318 10:40:00.160690 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="extract-content" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.160696 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="extract-content" Mar 18 10:40:00 crc kubenswrapper[4733]: E0318 10:40:00.160722 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.160727 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4733]: E0318 10:40:00.160740 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3959ab36-a688-40ac-b70b-b3cc35b1d7a1" containerName="oc" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.160746 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="3959ab36-a688-40ac-b70b-b3cc35b1d7a1" containerName="oc" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.160881 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="3959ab36-a688-40ac-b70b-b3cc35b1d7a1" containerName="oc" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.160906 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c43d05-915d-4373-aae0-67a182acc4bc" containerName="registry-server" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.161466 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-lvjwz" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.165120 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.165231 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.165871 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.175069 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:40:00 crc kubenswrapper[4733]: E0318 10:40:00.175446 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.176402 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:40:00 crc kubenswrapper[4733]: E0318 10:40:00.177060 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.177094 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-lvjwz"] Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 
10:40:00.232749 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvhm5\" (UniqueName: \"kubernetes.io/projected/be59cc18-c769-443f-962f-042b8ba456b8-kube-api-access-rvhm5\") pod \"auto-csr-approver-29563840-lvjwz\" (UID: \"be59cc18-c769-443f-962f-042b8ba456b8\") " pod="openshift-infra/auto-csr-approver-29563840-lvjwz" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.334629 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvhm5\" (UniqueName: \"kubernetes.io/projected/be59cc18-c769-443f-962f-042b8ba456b8-kube-api-access-rvhm5\") pod \"auto-csr-approver-29563840-lvjwz\" (UID: \"be59cc18-c769-443f-962f-042b8ba456b8\") " pod="openshift-infra/auto-csr-approver-29563840-lvjwz" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.367623 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvhm5\" (UniqueName: \"kubernetes.io/projected/be59cc18-c769-443f-962f-042b8ba456b8-kube-api-access-rvhm5\") pod \"auto-csr-approver-29563840-lvjwz\" (UID: \"be59cc18-c769-443f-962f-042b8ba456b8\") " pod="openshift-infra/auto-csr-approver-29563840-lvjwz" Mar 18 10:40:00 crc kubenswrapper[4733]: I0318 10:40:00.496468 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-lvjwz" Mar 18 10:40:01 crc kubenswrapper[4733]: W0318 10:40:01.047646 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe59cc18_c769_443f_962f_042b8ba456b8.slice/crio-5c4e2819f246915dad9f03c2c399d751618824fd626fb44a75fc69ec2dd1b824 WatchSource:0}: Error finding container 5c4e2819f246915dad9f03c2c399d751618824fd626fb44a75fc69ec2dd1b824: Status 404 returned error can't find the container with id 5c4e2819f246915dad9f03c2c399d751618824fd626fb44a75fc69ec2dd1b824 Mar 18 10:40:01 crc kubenswrapper[4733]: I0318 10:40:01.049085 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-lvjwz"] Mar 18 10:40:01 crc kubenswrapper[4733]: I0318 10:40:01.193724 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-lvjwz" event={"ID":"be59cc18-c769-443f-962f-042b8ba456b8","Type":"ContainerStarted","Data":"5c4e2819f246915dad9f03c2c399d751618824fd626fb44a75fc69ec2dd1b824"} Mar 18 10:40:03 crc kubenswrapper[4733]: I0318 10:40:03.232679 4733 generic.go:334] "Generic (PLEG): container finished" podID="be59cc18-c769-443f-962f-042b8ba456b8" containerID="875056c9052fb9c1578503b6e7b5412bb4aa77c6a765ef7ede001dc3e0cb6698" exitCode=0 Mar 18 10:40:03 crc kubenswrapper[4733]: I0318 10:40:03.232962 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-lvjwz" event={"ID":"be59cc18-c769-443f-962f-042b8ba456b8","Type":"ContainerDied","Data":"875056c9052fb9c1578503b6e7b5412bb4aa77c6a765ef7ede001dc3e0cb6698"} Mar 18 10:40:04 crc kubenswrapper[4733]: I0318 10:40:04.641968 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-lvjwz" Mar 18 10:40:04 crc kubenswrapper[4733]: I0318 10:40:04.823357 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvhm5\" (UniqueName: \"kubernetes.io/projected/be59cc18-c769-443f-962f-042b8ba456b8-kube-api-access-rvhm5\") pod \"be59cc18-c769-443f-962f-042b8ba456b8\" (UID: \"be59cc18-c769-443f-962f-042b8ba456b8\") " Mar 18 10:40:04 crc kubenswrapper[4733]: I0318 10:40:04.832954 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be59cc18-c769-443f-962f-042b8ba456b8-kube-api-access-rvhm5" (OuterVolumeSpecName: "kube-api-access-rvhm5") pod "be59cc18-c769-443f-962f-042b8ba456b8" (UID: "be59cc18-c769-443f-962f-042b8ba456b8"). InnerVolumeSpecName "kube-api-access-rvhm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:40:04 crc kubenswrapper[4733]: I0318 10:40:04.925589 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvhm5\" (UniqueName: \"kubernetes.io/projected/be59cc18-c769-443f-962f-042b8ba456b8-kube-api-access-rvhm5\") on node \"crc\" DevicePath \"\"" Mar 18 10:40:05 crc kubenswrapper[4733]: I0318 10:40:05.265167 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563840-lvjwz" event={"ID":"be59cc18-c769-443f-962f-042b8ba456b8","Type":"ContainerDied","Data":"5c4e2819f246915dad9f03c2c399d751618824fd626fb44a75fc69ec2dd1b824"} Mar 18 10:40:05 crc kubenswrapper[4733]: I0318 10:40:05.265258 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c4e2819f246915dad9f03c2c399d751618824fd626fb44a75fc69ec2dd1b824" Mar 18 10:40:05 crc kubenswrapper[4733]: I0318 10:40:05.265340 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563840-lvjwz" Mar 18 10:40:05 crc kubenswrapper[4733]: I0318 10:40:05.745117 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-7bqxq"] Mar 18 10:40:05 crc kubenswrapper[4733]: I0318 10:40:05.755499 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563834-7bqxq"] Mar 18 10:40:07 crc kubenswrapper[4733]: I0318 10:40:07.195676 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316" path="/var/lib/kubelet/pods/dc0f56eb-6c6a-49bf-9a12-ef5f2dd95316/volumes" Mar 18 10:40:09 crc kubenswrapper[4733]: I0318 10:40:09.599645 4733 scope.go:117] "RemoveContainer" containerID="2746f736c334d9ac3079e5dc9b5db5929c610a6933e47547e58536ec78e443c9" Mar 18 10:40:11 crc kubenswrapper[4733]: I0318 10:40:11.179826 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:40:11 crc kubenswrapper[4733]: E0318 10:40:11.180469 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:40:14 crc kubenswrapper[4733]: I0318 10:40:14.175978 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:40:14 crc kubenswrapper[4733]: I0318 10:40:14.176417 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:40:14 crc kubenswrapper[4733]: E0318 10:40:14.176689 4733 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:40:14 crc kubenswrapper[4733]: E0318 10:40:14.176874 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:40:26 crc kubenswrapper[4733]: I0318 10:40:26.175559 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:40:26 crc kubenswrapper[4733]: E0318 10:40:26.176223 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:40:27 crc kubenswrapper[4733]: I0318 10:40:27.175585 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:40:27 crc kubenswrapper[4733]: E0318 10:40:27.176285 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:40:28 crc 
kubenswrapper[4733]: I0318 10:40:28.175925 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:40:28 crc kubenswrapper[4733]: E0318 10:40:28.176305 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:40:37 crc kubenswrapper[4733]: I0318 10:40:37.175333 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:40:37 crc kubenswrapper[4733]: E0318 10:40:37.176332 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:40:40 crc kubenswrapper[4733]: I0318 10:40:40.178461 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:40:40 crc kubenswrapper[4733]: E0318 10:40:40.179179 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:40:41 crc kubenswrapper[4733]: I0318 10:40:41.204033 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 
10:40:41 crc kubenswrapper[4733]: E0318 10:40:41.205570 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:40:51 crc kubenswrapper[4733]: I0318 10:40:51.184513 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:40:51 crc kubenswrapper[4733]: E0318 10:40:51.185535 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:40:52 crc kubenswrapper[4733]: I0318 10:40:52.176662 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:40:52 crc kubenswrapper[4733]: E0318 10:40:52.177575 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:40:52 crc kubenswrapper[4733]: I0318 10:40:52.178318 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:40:52 crc kubenswrapper[4733]: E0318 10:40:52.178804 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:41:04 crc kubenswrapper[4733]: I0318 10:41:04.176242 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:41:04 crc kubenswrapper[4733]: I0318 10:41:04.177284 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:41:04 crc kubenswrapper[4733]: E0318 10:41:04.177620 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:41:04 crc kubenswrapper[4733]: E0318 10:41:04.177743 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:41:06 crc kubenswrapper[4733]: I0318 10:41:06.176281 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:41:06 crc kubenswrapper[4733]: E0318 10:41:06.177123 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:41:15 crc kubenswrapper[4733]: I0318 10:41:15.175935 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:41:15 crc kubenswrapper[4733]: E0318 10:41:15.177290 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:41:16 crc kubenswrapper[4733]: I0318 10:41:16.178357 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:41:16 crc kubenswrapper[4733]: E0318 10:41:16.179739 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:17.928263 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:17.928610 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:17.928690 4733 patch_prober.go:28] interesting pod/router-default-5444994796-xl5d7 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:17.928705 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-xl5d7" podUID="9c5f567e-b38f-44a0-b1fd-1a96857e811f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:17.973314 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg" podUID="216f9239-7d2e-483e-a89f-0955a518aa4a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.78:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.017921 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gkndg" podUID="216f9239-7d2e-483e-a89f-0955a518aa4a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.78:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.177717 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.377438 4733 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2wc5m container/olm-operator namespace/openshift-operator-lifecycle-manager: 
Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.422447 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" podUID="cd9234ed-fcbc-4d81-9034-27d39b3df6ee" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.90:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.467448 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" podUID="6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.512373 4733 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2wc5m container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.512421 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" podUID="9571ba80-f267-46ed-8d16-e44531cb0ce8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.512493 4733 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-sqr4g" podUID="cd9234ed-fcbc-4d81-9034-27d39b3df6ee" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.90:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.512561 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nskpj" podUID="6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:18.377515 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2wc5m" podUID="9571ba80-f267-46ed-8d16-e44531cb0ce8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.143653 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-dr9dg" podUID="03476444-8ff8-4b1e-bcbc-ee654241370b" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.52:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.143788 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z6qb2 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 
10:41:19.143821 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" podUID="8ae3847e-6357-46a1-9578-88deb6e1531b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.144284 4733 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-kd6gw container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.144324 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kd6gw" podUID="9b0edb65-3bcf-484f-9707-d8124df1ec88" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: E0318 10:41:19.154965 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.168666 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-pc5zz" podUID="4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.168760 4733 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z6qb2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.168783 4733 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z6qb2" podUID="8ae3847e-6357-46a1-9578-88deb6e1531b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.168926 4733 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 10:41:19 crc kubenswrapper[4733]: I0318 10:41:19.168947 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 10:41:27 crc kubenswrapper[4733]: I0318 10:41:27.175581 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:41:27 crc kubenswrapper[4733]: E0318 10:41:27.176521 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:41:29 crc kubenswrapper[4733]: I0318 10:41:29.175570 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:41:29 crc kubenswrapper[4733]: E0318 10:41:29.176293 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:41:30 crc kubenswrapper[4733]: I0318 10:41:30.176109 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:41:30 crc kubenswrapper[4733]: E0318 10:41:30.177720 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:41:38 crc kubenswrapper[4733]: I0318 10:41:38.176243 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:41:38 crc kubenswrapper[4733]: E0318 10:41:38.177724 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.072408 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mr65v"] Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.081565 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0937-account-create-update-bfx7n"] Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.088867 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5gwmb"] Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.095842 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5e1e-account-create-update-r9bb4"] Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.103370 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mr65v"] Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.109323 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5e1e-account-create-update-r9bb4"] Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.113994 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0937-account-create-update-bfx7n"] Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.118619 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5gwmb"] Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.185074 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07730b47-54ba-4b79-952e-6fb12b3b5279" path="/var/lib/kubelet/pods/07730b47-54ba-4b79-952e-6fb12b3b5279/volumes" Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.185774 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a5425fb-7059-4262-9c68-1420a5f3b4f1" 
path="/var/lib/kubelet/pods/0a5425fb-7059-4262-9c68-1420a5f3b4f1/volumes" Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.186327 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a7c351-0be1-4547-bacc-8ff02cb59328" path="/var/lib/kubelet/pods/30a7c351-0be1-4547-bacc-8ff02cb59328/volumes" Mar 18 10:41:41 crc kubenswrapper[4733]: I0318 10:41:41.186858 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6346115-9e7a-4489-916d-a129aa83a6dd" path="/var/lib/kubelet/pods/a6346115-9e7a-4489-916d-a129aa83a6dd/volumes" Mar 18 10:41:42 crc kubenswrapper[4733]: I0318 10:41:42.176047 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:41:42 crc kubenswrapper[4733]: E0318 10:41:42.176553 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:41:42 crc kubenswrapper[4733]: I0318 10:41:42.177023 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:41:42 crc kubenswrapper[4733]: E0318 10:41:42.177438 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:41:45 crc kubenswrapper[4733]: I0318 10:41:45.045091 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5795-account-create-update-nkww7"] 
Mar 18 10:41:45 crc kubenswrapper[4733]: I0318 10:41:45.057833 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-fvlqt"] Mar 18 10:41:45 crc kubenswrapper[4733]: I0318 10:41:45.064256 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-fvlqt"] Mar 18 10:41:45 crc kubenswrapper[4733]: I0318 10:41:45.070994 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5795-account-create-update-nkww7"] Mar 18 10:41:45 crc kubenswrapper[4733]: I0318 10:41:45.187057 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d4401f-2343-41fa-82ae-877674337bf4" path="/var/lib/kubelet/pods/84d4401f-2343-41fa-82ae-877674337bf4/volumes" Mar 18 10:41:45 crc kubenswrapper[4733]: I0318 10:41:45.187884 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5315db-fb68-4558-85c1-cf538d0e2770" path="/var/lib/kubelet/pods/ff5315db-fb68-4558-85c1-cf538d0e2770/volumes" Mar 18 10:41:53 crc kubenswrapper[4733]: I0318 10:41:53.175841 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:41:53 crc kubenswrapper[4733]: I0318 10:41:53.176644 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:41:53 crc kubenswrapper[4733]: E0318 10:41:53.176989 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:41:53 crc kubenswrapper[4733]: E0318 10:41:53.177020 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.706035 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tljzm"] Mar 18 10:41:55 crc kubenswrapper[4733]: E0318 10:41:55.725603 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be59cc18-c769-443f-962f-042b8ba456b8" containerName="oc" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.725648 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="be59cc18-c769-443f-962f-042b8ba456b8" containerName="oc" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.726262 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="be59cc18-c769-443f-962f-042b8ba456b8" containerName="oc" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.729069 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tljzm"] Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.729263 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.772029 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-utilities\") pod \"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.772160 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-catalog-content\") pod \"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.772221 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5462\" (UniqueName: \"kubernetes.io/projected/137e4764-e111-4ef3-ac40-3f8a07e4df8a-kube-api-access-t5462\") pod \"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.873802 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-catalog-content\") pod \"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.873851 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5462\" (UniqueName: \"kubernetes.io/projected/137e4764-e111-4ef3-ac40-3f8a07e4df8a-kube-api-access-t5462\") pod 
\"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.873903 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-utilities\") pod \"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.874459 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-utilities\") pod \"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.874461 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-catalog-content\") pod \"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:55 crc kubenswrapper[4733]: I0318 10:41:55.899624 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5462\" (UniqueName: \"kubernetes.io/projected/137e4764-e111-4ef3-ac40-3f8a07e4df8a-kube-api-access-t5462\") pod \"certified-operators-tljzm\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:56 crc kubenswrapper[4733]: I0318 10:41:56.083429 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:41:56 crc kubenswrapper[4733]: I0318 10:41:56.175991 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:41:56 crc kubenswrapper[4733]: E0318 10:41:56.176651 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:41:56 crc kubenswrapper[4733]: W0318 10:41:56.611357 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod137e4764_e111_4ef3_ac40_3f8a07e4df8a.slice/crio-387eba0329a2b04d4a916eb64b70411eb77b955884ab7529c7638dae04a05a85 WatchSource:0}: Error finding container 387eba0329a2b04d4a916eb64b70411eb77b955884ab7529c7638dae04a05a85: Status 404 returned error can't find the container with id 387eba0329a2b04d4a916eb64b70411eb77b955884ab7529c7638dae04a05a85 Mar 18 10:41:56 crc kubenswrapper[4733]: I0318 10:41:56.623052 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tljzm"] Mar 18 10:41:57 crc kubenswrapper[4733]: I0318 10:41:57.551079 4733 generic.go:334] "Generic (PLEG): container finished" podID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerID="4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307" exitCode=0 Mar 18 10:41:57 crc kubenswrapper[4733]: I0318 10:41:57.551224 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tljzm" event={"ID":"137e4764-e111-4ef3-ac40-3f8a07e4df8a","Type":"ContainerDied","Data":"4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307"} Mar 18 10:41:57 crc kubenswrapper[4733]: I0318 
10:41:57.551639 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tljzm" event={"ID":"137e4764-e111-4ef3-ac40-3f8a07e4df8a","Type":"ContainerStarted","Data":"387eba0329a2b04d4a916eb64b70411eb77b955884ab7529c7638dae04a05a85"} Mar 18 10:41:59 crc kubenswrapper[4733]: I0318 10:41:59.572332 4733 generic.go:334] "Generic (PLEG): container finished" podID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerID="fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8" exitCode=0 Mar 18 10:41:59 crc kubenswrapper[4733]: I0318 10:41:59.572451 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tljzm" event={"ID":"137e4764-e111-4ef3-ac40-3f8a07e4df8a","Type":"ContainerDied","Data":"fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8"} Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.158521 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563842-78hdh"] Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.160243 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-78hdh" Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.170670 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-78hdh"] Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.187837 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.187894 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.188163 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.357985 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfnb8\" (UniqueName: \"kubernetes.io/projected/8a6751ff-c17a-472f-b315-06edff233f07-kube-api-access-hfnb8\") pod \"auto-csr-approver-29563842-78hdh\" (UID: \"8a6751ff-c17a-472f-b315-06edff233f07\") " pod="openshift-infra/auto-csr-approver-29563842-78hdh" Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.459407 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfnb8\" (UniqueName: \"kubernetes.io/projected/8a6751ff-c17a-472f-b315-06edff233f07-kube-api-access-hfnb8\") pod \"auto-csr-approver-29563842-78hdh\" (UID: \"8a6751ff-c17a-472f-b315-06edff233f07\") " pod="openshift-infra/auto-csr-approver-29563842-78hdh" Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.498430 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfnb8\" (UniqueName: \"kubernetes.io/projected/8a6751ff-c17a-472f-b315-06edff233f07-kube-api-access-hfnb8\") pod \"auto-csr-approver-29563842-78hdh\" (UID: \"8a6751ff-c17a-472f-b315-06edff233f07\") " 
pod="openshift-infra/auto-csr-approver-29563842-78hdh" Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.524834 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-78hdh" Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.585105 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tljzm" event={"ID":"137e4764-e111-4ef3-ac40-3f8a07e4df8a","Type":"ContainerStarted","Data":"5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b"} Mar 18 10:42:00 crc kubenswrapper[4733]: I0318 10:42:00.628105 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tljzm" podStartSLOduration=3.1727671 podStartE2EDuration="5.628085808s" podCreationTimestamp="2026-03-18 10:41:55 +0000 UTC" firstStartedPulling="2026-03-18 10:41:57.554577155 +0000 UTC m=+1757.046311520" lastFinishedPulling="2026-03-18 10:42:00.009895863 +0000 UTC m=+1759.501630228" observedRunningTime="2026-03-18 10:42:00.624625411 +0000 UTC m=+1760.116359806" watchObservedRunningTime="2026-03-18 10:42:00.628085808 +0000 UTC m=+1760.119820143" Mar 18 10:42:01 crc kubenswrapper[4733]: I0318 10:42:01.011990 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-78hdh"] Mar 18 10:42:01 crc kubenswrapper[4733]: W0318 10:42:01.027377 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a6751ff_c17a_472f_b315_06edff233f07.slice/crio-c8194ce2558598ec34fe5d174c0be4529d1eed31838e848e03a5a42254a157e5 WatchSource:0}: Error finding container c8194ce2558598ec34fe5d174c0be4529d1eed31838e848e03a5a42254a157e5: Status 404 returned error can't find the container with id c8194ce2558598ec34fe5d174c0be4529d1eed31838e848e03a5a42254a157e5 Mar 18 10:42:01 crc kubenswrapper[4733]: I0318 10:42:01.594468 4733 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-78hdh" event={"ID":"8a6751ff-c17a-472f-b315-06edff233f07","Type":"ContainerStarted","Data":"c8194ce2558598ec34fe5d174c0be4529d1eed31838e848e03a5a42254a157e5"} Mar 18 10:42:02 crc kubenswrapper[4733]: I0318 10:42:02.606532 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-78hdh" event={"ID":"8a6751ff-c17a-472f-b315-06edff233f07","Type":"ContainerStarted","Data":"56204ccc444d383e182644d7909ef75775c83c4da5e6940b6afcaf6c25fa0fc2"} Mar 18 10:42:02 crc kubenswrapper[4733]: I0318 10:42:02.628997 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563842-78hdh" podStartSLOduration=1.483118036 podStartE2EDuration="2.628976965s" podCreationTimestamp="2026-03-18 10:42:00 +0000 UTC" firstStartedPulling="2026-03-18 10:42:01.031261796 +0000 UTC m=+1760.522996161" lastFinishedPulling="2026-03-18 10:42:02.177120725 +0000 UTC m=+1761.668855090" observedRunningTime="2026-03-18 10:42:02.621952467 +0000 UTC m=+1762.113686872" watchObservedRunningTime="2026-03-18 10:42:02.628976965 +0000 UTC m=+1762.120711290" Mar 18 10:42:03 crc kubenswrapper[4733]: I0318 10:42:03.621802 4733 generic.go:334] "Generic (PLEG): container finished" podID="8a6751ff-c17a-472f-b315-06edff233f07" containerID="56204ccc444d383e182644d7909ef75775c83c4da5e6940b6afcaf6c25fa0fc2" exitCode=0 Mar 18 10:42:03 crc kubenswrapper[4733]: I0318 10:42:03.621898 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-78hdh" event={"ID":"8a6751ff-c17a-472f-b315-06edff233f07","Type":"ContainerDied","Data":"56204ccc444d383e182644d7909ef75775c83c4da5e6940b6afcaf6c25fa0fc2"} Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.055585 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-78hdh" Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.244811 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfnb8\" (UniqueName: \"kubernetes.io/projected/8a6751ff-c17a-472f-b315-06edff233f07-kube-api-access-hfnb8\") pod \"8a6751ff-c17a-472f-b315-06edff233f07\" (UID: \"8a6751ff-c17a-472f-b315-06edff233f07\") " Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.253151 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6751ff-c17a-472f-b315-06edff233f07-kube-api-access-hfnb8" (OuterVolumeSpecName: "kube-api-access-hfnb8") pod "8a6751ff-c17a-472f-b315-06edff233f07" (UID: "8a6751ff-c17a-472f-b315-06edff233f07"). InnerVolumeSpecName "kube-api-access-hfnb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.347100 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfnb8\" (UniqueName: \"kubernetes.io/projected/8a6751ff-c17a-472f-b315-06edff233f07-kube-api-access-hfnb8\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.644914 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563842-78hdh" event={"ID":"8a6751ff-c17a-472f-b315-06edff233f07","Type":"ContainerDied","Data":"c8194ce2558598ec34fe5d174c0be4529d1eed31838e848e03a5a42254a157e5"} Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.644975 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8194ce2558598ec34fe5d174c0be4529d1eed31838e848e03a5a42254a157e5" Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.645063 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563842-78hdh" Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.707379 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-4x58h"] Mar 18 10:42:05 crc kubenswrapper[4733]: I0318 10:42:05.719313 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563836-4x58h"] Mar 18 10:42:06 crc kubenswrapper[4733]: I0318 10:42:06.083914 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:42:06 crc kubenswrapper[4733]: I0318 10:42:06.083972 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:42:06 crc kubenswrapper[4733]: I0318 10:42:06.165234 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:42:06 crc kubenswrapper[4733]: I0318 10:42:06.746665 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:42:06 crc kubenswrapper[4733]: I0318 10:42:06.814500 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tljzm"] Mar 18 10:42:07 crc kubenswrapper[4733]: I0318 10:42:07.176215 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:42:07 crc kubenswrapper[4733]: E0318 10:42:07.176629 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" 
podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:42:07 crc kubenswrapper[4733]: I0318 10:42:07.193908 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31d006c-81a6-4bbb-a44a-fda966944372" path="/var/lib/kubelet/pods/d31d006c-81a6-4bbb-a44a-fda966944372/volumes" Mar 18 10:42:08 crc kubenswrapper[4733]: I0318 10:42:08.177971 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:42:08 crc kubenswrapper[4733]: E0318 10:42:08.178333 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:42:08 crc kubenswrapper[4733]: I0318 10:42:08.682899 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tljzm" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerName="registry-server" containerID="cri-o://5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b" gracePeriod=2 Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.175630 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:42:09 crc kubenswrapper[4733]: E0318 10:42:09.176936 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.243474 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.320075 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-utilities\") pod \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.320180 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5462\" (UniqueName: \"kubernetes.io/projected/137e4764-e111-4ef3-ac40-3f8a07e4df8a-kube-api-access-t5462\") pod \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.320393 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-catalog-content\") pod \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\" (UID: \"137e4764-e111-4ef3-ac40-3f8a07e4df8a\") " Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.322223 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-utilities" (OuterVolumeSpecName: "utilities") pod "137e4764-e111-4ef3-ac40-3f8a07e4df8a" (UID: "137e4764-e111-4ef3-ac40-3f8a07e4df8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.331317 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137e4764-e111-4ef3-ac40-3f8a07e4df8a-kube-api-access-t5462" (OuterVolumeSpecName: "kube-api-access-t5462") pod "137e4764-e111-4ef3-ac40-3f8a07e4df8a" (UID: "137e4764-e111-4ef3-ac40-3f8a07e4df8a"). InnerVolumeSpecName "kube-api-access-t5462". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.407281 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "137e4764-e111-4ef3-ac40-3f8a07e4df8a" (UID: "137e4764-e111-4ef3-ac40-3f8a07e4df8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.422122 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.422165 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137e4764-e111-4ef3-ac40-3f8a07e4df8a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.422206 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5462\" (UniqueName: \"kubernetes.io/projected/137e4764-e111-4ef3-ac40-3f8a07e4df8a-kube-api-access-t5462\") on node \"crc\" DevicePath \"\"" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.695072 4733 generic.go:334] "Generic (PLEG): container finished" podID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerID="5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b" exitCode=0 Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.695136 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tljzm" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.695157 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tljzm" event={"ID":"137e4764-e111-4ef3-ac40-3f8a07e4df8a","Type":"ContainerDied","Data":"5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b"} Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.695633 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tljzm" event={"ID":"137e4764-e111-4ef3-ac40-3f8a07e4df8a","Type":"ContainerDied","Data":"387eba0329a2b04d4a916eb64b70411eb77b955884ab7529c7638dae04a05a85"} Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.695705 4733 scope.go:117] "RemoveContainer" containerID="5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.711839 4733 scope.go:117] "RemoveContainer" containerID="ce9a99c6df86d54aacd4034e75a79275a1f1a3fe6a26a1b9d309967e3b0b146b" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.725286 4733 scope.go:117] "RemoveContainer" containerID="fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.752579 4733 scope.go:117] "RemoveContainer" containerID="754b489534a1d2e07cfe28b803b1041b8c35b7a8d870ab7643873669d480405d" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.761408 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tljzm"] Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.773531 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tljzm"] Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.806548 4733 scope.go:117] "RemoveContainer" containerID="1a52840f130018d9dd9a4d4957090d0bfe7cddccea8c86d998fc7ce63f88d2c3" Mar 18 10:42:09 crc 
kubenswrapper[4733]: I0318 10:42:09.821803 4733 scope.go:117] "RemoveContainer" containerID="4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.879893 4733 scope.go:117] "RemoveContainer" containerID="5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b" Mar 18 10:42:09 crc kubenswrapper[4733]: E0318 10:42:09.880634 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b\": container with ID starting with 5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b not found: ID does not exist" containerID="5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.880687 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b"} err="failed to get container status \"5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b\": rpc error: code = NotFound desc = could not find container \"5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b\": container with ID starting with 5d36f5790f642401acb40728675a784e6d4f0ef6401d704d6528ca8351f6c46b not found: ID does not exist" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.880723 4733 scope.go:117] "RemoveContainer" containerID="fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8" Mar 18 10:42:09 crc kubenswrapper[4733]: E0318 10:42:09.881162 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8\": container with ID starting with fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8 not found: ID does not exist" 
containerID="fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.881238 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8"} err="failed to get container status \"fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8\": rpc error: code = NotFound desc = could not find container \"fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8\": container with ID starting with fb254defcfea9fc1997376ee41afbe7b64c90aee6f3e5d2915155a5e3c51f9c8 not found: ID does not exist" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.881266 4733 scope.go:117] "RemoveContainer" containerID="4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307" Mar 18 10:42:09 crc kubenswrapper[4733]: E0318 10:42:09.881741 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307\": container with ID starting with 4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307 not found: ID does not exist" containerID="4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.881787 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307"} err="failed to get container status \"4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307\": rpc error: code = NotFound desc = could not find container \"4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307\": container with ID starting with 4fcc3926dcfa6f9c4be1799c42751eb86e56709f553565e31fa0e718a378e307 not found: ID does not exist" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.901607 4733 scope.go:117] 
"RemoveContainer" containerID="cb5c331f367d49d9d35cab0a581b0fd4e3d8921934861b35f887d6648ae09cfb" Mar 18 10:42:09 crc kubenswrapper[4733]: I0318 10:42:09.974973 4733 scope.go:117] "RemoveContainer" containerID="c39830f7afc41d4e539449a18ea110efdcfaa942a6d99809d36a31233d6cb82b" Mar 18 10:42:10 crc kubenswrapper[4733]: I0318 10:42:10.009762 4733 scope.go:117] "RemoveContainer" containerID="4d1446897edd6664fd044255842a94c2933d1bab0fe0d09f54123f9a53833063" Mar 18 10:42:10 crc kubenswrapper[4733]: I0318 10:42:10.077118 4733 scope.go:117] "RemoveContainer" containerID="881999bdad04a088176edfb2a1165638bbb818ce5892ed189c2612e4735ca703" Mar 18 10:42:11 crc kubenswrapper[4733]: I0318 10:42:11.188221 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" path="/var/lib/kubelet/pods/137e4764-e111-4ef3-ac40-3f8a07e4df8a/volumes" Mar 18 10:42:12 crc kubenswrapper[4733]: I0318 10:42:12.048743 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xb87f"] Mar 18 10:42:12 crc kubenswrapper[4733]: I0318 10:42:12.057876 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xb87f"] Mar 18 10:42:13 crc kubenswrapper[4733]: I0318 10:42:13.184643 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d533f566-eded-44ca-b276-7e3d437f9fce" path="/var/lib/kubelet/pods/d533f566-eded-44ca-b276-7e3d437f9fce/volumes" Mar 18 10:42:22 crc kubenswrapper[4733]: I0318 10:42:22.176117 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a" Mar 18 10:42:22 crc kubenswrapper[4733]: I0318 10:42:22.176679 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c" Mar 18 10:42:22 crc kubenswrapper[4733]: I0318 10:42:22.176856 4733 scope.go:117] "RemoveContainer" 
containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:42:22 crc kubenswrapper[4733]: E0318 10:42:22.176894 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:42:22 crc kubenswrapper[4733]: E0318 10:42:22.177032 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:42:22 crc kubenswrapper[4733]: E0318 10:42:22.177386 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:42:29 crc kubenswrapper[4733]: I0318 10:42:29.050719 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ptbmt"] Mar 18 10:42:29 crc kubenswrapper[4733]: I0318 10:42:29.064117 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ptbmt"] Mar 18 10:42:29 crc kubenswrapper[4733]: I0318 10:42:29.190946 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c8f7bc-4162-4903-b3f9-96c8736a27b8" path="/var/lib/kubelet/pods/63c8f7bc-4162-4903-b3f9-96c8736a27b8/volumes" Mar 18 10:42:33 crc kubenswrapper[4733]: I0318 
10:42:33.176564 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:42:33 crc kubenswrapper[4733]: E0318 10:42:33.177529 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:42:35 crc kubenswrapper[4733]: I0318 10:42:35.175923 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a"
Mar 18 10:42:35 crc kubenswrapper[4733]: E0318 10:42:35.177019 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:42:36 crc kubenswrapper[4733]: I0318 10:42:36.176457 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c"
Mar 18 10:42:36 crc kubenswrapper[4733]: E0318 10:42:36.177266 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:42:45 crc kubenswrapper[4733]: I0318 10:42:45.176281 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:42:45 crc kubenswrapper[4733]: E0318 10:42:45.177539 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:42:46 crc kubenswrapper[4733]: I0318 10:42:46.176278 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a"
Mar 18 10:42:46 crc kubenswrapper[4733]: E0318 10:42:46.176731 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:42:49 crc kubenswrapper[4733]: I0318 10:42:49.178424 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c"
Mar 18 10:42:49 crc kubenswrapper[4733]: E0318 10:42:49.178792 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:42:56 crc kubenswrapper[4733]: I0318 10:42:56.175399 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:42:56 crc kubenswrapper[4733]: E0318 10:42:56.176083 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:43:01 crc kubenswrapper[4733]: I0318 10:43:01.180978 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a"
Mar 18 10:43:01 crc kubenswrapper[4733]: E0318 10:43:01.181535 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:43:01 crc kubenswrapper[4733]: I0318 10:43:01.181670 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c"
Mar 18 10:43:01 crc kubenswrapper[4733]: E0318 10:43:01.182056 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:43:07 crc kubenswrapper[4733]: I0318 10:43:07.175495 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:43:07 crc kubenswrapper[4733]: E0318 10:43:07.176344 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:43:10 crc kubenswrapper[4733]: I0318 10:43:10.201844 4733 scope.go:117] "RemoveContainer" containerID="5eade19f29d1bfd378a24e80adf648a30d27b707a4c07763d5b7990ffd79ce55"
Mar 18 10:43:10 crc kubenswrapper[4733]: I0318 10:43:10.252105 4733 scope.go:117] "RemoveContainer" containerID="7076d89bfeedd95679091846270edbde667d954c3e6fdb8ee00f499b50144915"
Mar 18 10:43:12 crc kubenswrapper[4733]: I0318 10:43:12.176422 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c"
Mar 18 10:43:12 crc kubenswrapper[4733]: E0318 10:43:12.177163 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:43:13 crc kubenswrapper[4733]: I0318 10:43:13.177003 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a"
Mar 18 10:43:14 crc kubenswrapper[4733]: I0318 10:43:14.320589 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"}
Mar 18 10:43:14 crc kubenswrapper[4733]: I0318 10:43:14.321740 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 18 10:43:18 crc kubenswrapper[4733]: I0318 10:43:18.362606 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" exitCode=0
Mar 18 10:43:18 crc kubenswrapper[4733]: I0318 10:43:18.362742 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"}
Mar 18 10:43:18 crc kubenswrapper[4733]: I0318 10:43:18.362973 4733 scope.go:117] "RemoveContainer" containerID="39f4b0f268d8d6f6613db69085678023c98d9ef75625187f285b2a88548a855a"
Mar 18 10:43:18 crc kubenswrapper[4733]: I0318 10:43:18.364016 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"
Mar 18 10:43:18 crc kubenswrapper[4733]: E0318 10:43:18.364691 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:43:20 crc kubenswrapper[4733]: I0318 10:43:20.175867 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:43:20 crc kubenswrapper[4733]: E0318 10:43:20.176488 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:43:23 crc kubenswrapper[4733]: I0318 10:43:23.177515 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c"
Mar 18 10:43:23 crc kubenswrapper[4733]: I0318 10:43:23.417077 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"}
Mar 18 10:43:23 crc kubenswrapper[4733]: I0318 10:43:23.418017 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 10:43:28 crc kubenswrapper[4733]: I0318 10:43:28.470077 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" exitCode=0
Mar 18 10:43:28 crc kubenswrapper[4733]: I0318 10:43:28.470272 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"}
Mar 18 10:43:28 crc kubenswrapper[4733]: I0318 10:43:28.471156 4733 scope.go:117] "RemoveContainer" containerID="34bcdecb0fa459a1d6253fb10ac79d88f89fe0809b9e5dec1a6ca99fb8f7810c"
Mar 18 10:43:28 crc kubenswrapper[4733]: I0318 10:43:28.472253 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"
Mar 18 10:43:28 crc kubenswrapper[4733]: E0318 10:43:28.472815 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:43:32 crc kubenswrapper[4733]: I0318 10:43:32.177100 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:43:32 crc kubenswrapper[4733]: E0318 10:43:32.179575 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:43:34 crc kubenswrapper[4733]: I0318 10:43:34.175798 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"
Mar 18 10:43:34 crc kubenswrapper[4733]: E0318 10:43:34.176460 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:43:43 crc kubenswrapper[4733]: I0318 10:43:43.177241 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"
Mar 18 10:43:43 crc kubenswrapper[4733]: E0318 10:43:43.178048 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:43:47 crc kubenswrapper[4733]: I0318 10:43:47.175676 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:43:47 crc kubenswrapper[4733]: E0318 10:43:47.176685 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:43:49 crc kubenswrapper[4733]: I0318 10:43:49.176290 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"
Mar 18 10:43:49 crc kubenswrapper[4733]: E0318 10:43:49.177000 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:43:54 crc kubenswrapper[4733]: I0318 10:43:54.176175 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"
Mar 18 10:43:54 crc kubenswrapper[4733]: E0318 10:43:54.177115 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:43:58 crc kubenswrapper[4733]: I0318 10:43:58.176098 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:43:58 crc kubenswrapper[4733]: E0318 10:43:58.177037 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.148336 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563844-cdw7c"]
Mar 18 10:44:00 crc kubenswrapper[4733]: E0318 10:44:00.148947 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerName="extract-content"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.148958 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerName="extract-content"
Mar 18 10:44:00 crc kubenswrapper[4733]: E0318 10:44:00.148967 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6751ff-c17a-472f-b315-06edff233f07" containerName="oc"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.148973 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6751ff-c17a-472f-b315-06edff233f07" containerName="oc"
Mar 18 10:44:00 crc kubenswrapper[4733]: E0318 10:44:00.148991 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerName="registry-server"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.148997 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerName="registry-server"
Mar 18 10:44:00 crc kubenswrapper[4733]: E0318 10:44:00.149005 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerName="extract-utilities"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.149011 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerName="extract-utilities"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.149154 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="137e4764-e111-4ef3-ac40-3f8a07e4df8a" containerName="registry-server"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.149178 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6751ff-c17a-472f-b315-06edff233f07" containerName="oc"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.149666 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-cdw7c"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.151943 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.152237 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.152480 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.154660 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-cdw7c"]
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.328223 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw8tl\" (UniqueName: \"kubernetes.io/projected/a764647f-75c8-4ce3-82fb-b2b729a895a0-kube-api-access-dw8tl\") pod \"auto-csr-approver-29563844-cdw7c\" (UID: \"a764647f-75c8-4ce3-82fb-b2b729a895a0\") " pod="openshift-infra/auto-csr-approver-29563844-cdw7c"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.429876 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8tl\" (UniqueName: \"kubernetes.io/projected/a764647f-75c8-4ce3-82fb-b2b729a895a0-kube-api-access-dw8tl\") pod \"auto-csr-approver-29563844-cdw7c\" (UID: \"a764647f-75c8-4ce3-82fb-b2b729a895a0\") " pod="openshift-infra/auto-csr-approver-29563844-cdw7c"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.468484 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw8tl\" (UniqueName: \"kubernetes.io/projected/a764647f-75c8-4ce3-82fb-b2b729a895a0-kube-api-access-dw8tl\") pod \"auto-csr-approver-29563844-cdw7c\" (UID: \"a764647f-75c8-4ce3-82fb-b2b729a895a0\") " pod="openshift-infra/auto-csr-approver-29563844-cdw7c"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.515854 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-cdw7c"
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.784632 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-cdw7c"]
Mar 18 10:44:00 crc kubenswrapper[4733]: I0318 10:44:00.804582 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 10:44:01 crc kubenswrapper[4733]: I0318 10:44:01.063831 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-cdw7c" event={"ID":"a764647f-75c8-4ce3-82fb-b2b729a895a0","Type":"ContainerStarted","Data":"d7f8967a5b8e446f7586f68c0936471f337551cf637a1d1b3e484af1a9c36311"}
Mar 18 10:44:03 crc kubenswrapper[4733]: I0318 10:44:03.080407 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-cdw7c" event={"ID":"a764647f-75c8-4ce3-82fb-b2b729a895a0","Type":"ContainerStarted","Data":"4e31a574451b3bae3c66b0663fefb42d9a6b941c8dd7f0cf1a6c603f449c0e3b"}
Mar 18 10:44:03 crc kubenswrapper[4733]: I0318 10:44:03.107211 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563844-cdw7c" podStartSLOduration=1.302097448 podStartE2EDuration="3.107181679s" podCreationTimestamp="2026-03-18 10:44:00 +0000 UTC" firstStartedPulling="2026-03-18 10:44:00.803921483 +0000 UTC m=+1880.295655848" lastFinishedPulling="2026-03-18 10:44:02.609005714 +0000 UTC m=+1882.100740079" observedRunningTime="2026-03-18 10:44:03.101324904 +0000 UTC m=+1882.593059259" watchObservedRunningTime="2026-03-18 10:44:03.107181679 +0000 UTC m=+1882.598916004"
Mar 18 10:44:03 crc kubenswrapper[4733]: I0318 10:44:03.176340 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"
Mar 18 10:44:03 crc kubenswrapper[4733]: E0318 10:44:03.176588 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:44:04 crc kubenswrapper[4733]: I0318 10:44:04.091343 4733 generic.go:334] "Generic (PLEG): container finished" podID="a764647f-75c8-4ce3-82fb-b2b729a895a0" containerID="4e31a574451b3bae3c66b0663fefb42d9a6b941c8dd7f0cf1a6c603f449c0e3b" exitCode=0
Mar 18 10:44:04 crc kubenswrapper[4733]: I0318 10:44:04.091404 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-cdw7c" event={"ID":"a764647f-75c8-4ce3-82fb-b2b729a895a0","Type":"ContainerDied","Data":"4e31a574451b3bae3c66b0663fefb42d9a6b941c8dd7f0cf1a6c603f449c0e3b"}
Mar 18 10:44:06 crc kubenswrapper[4733]: I0318 10:44:06.418640 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-cdw7c"
Mar 18 10:44:06 crc kubenswrapper[4733]: I0318 10:44:06.561410 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw8tl\" (UniqueName: \"kubernetes.io/projected/a764647f-75c8-4ce3-82fb-b2b729a895a0-kube-api-access-dw8tl\") pod \"a764647f-75c8-4ce3-82fb-b2b729a895a0\" (UID: \"a764647f-75c8-4ce3-82fb-b2b729a895a0\") "
Mar 18 10:44:06 crc kubenswrapper[4733]: I0318 10:44:06.579546 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a764647f-75c8-4ce3-82fb-b2b729a895a0-kube-api-access-dw8tl" (OuterVolumeSpecName: "kube-api-access-dw8tl") pod "a764647f-75c8-4ce3-82fb-b2b729a895a0" (UID: "a764647f-75c8-4ce3-82fb-b2b729a895a0"). InnerVolumeSpecName "kube-api-access-dw8tl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 10:44:06 crc kubenswrapper[4733]: I0318 10:44:06.663124 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw8tl\" (UniqueName: \"kubernetes.io/projected/a764647f-75c8-4ce3-82fb-b2b729a895a0-kube-api-access-dw8tl\") on node \"crc\" DevicePath \"\""
Mar 18 10:44:07 crc kubenswrapper[4733]: I0318 10:44:07.181933 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563844-cdw7c"
Mar 18 10:44:07 crc kubenswrapper[4733]: I0318 10:44:07.196335 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563844-cdw7c" event={"ID":"a764647f-75c8-4ce3-82fb-b2b729a895a0","Type":"ContainerDied","Data":"d7f8967a5b8e446f7586f68c0936471f337551cf637a1d1b3e484af1a9c36311"}
Mar 18 10:44:07 crc kubenswrapper[4733]: I0318 10:44:07.196395 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7f8967a5b8e446f7586f68c0936471f337551cf637a1d1b3e484af1a9c36311"
Mar 18 10:44:07 crc kubenswrapper[4733]: I0318 10:44:07.524702 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-6fbsd"]
Mar 18 10:44:07 crc kubenswrapper[4733]: I0318 10:44:07.535259 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563838-6fbsd"]
Mar 18 10:44:09 crc kubenswrapper[4733]: I0318 10:44:09.175471 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"
Mar 18 10:44:09 crc kubenswrapper[4733]: E0318 10:44:09.176605 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:44:09 crc kubenswrapper[4733]: I0318 10:44:09.194840 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3959ab36-a688-40ac-b70b-b3cc35b1d7a1" path="/var/lib/kubelet/pods/3959ab36-a688-40ac-b70b-b3cc35b1d7a1/volumes"
Mar 18 10:44:10 crc kubenswrapper[4733]: I0318 10:44:10.177732 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:44:10 crc kubenswrapper[4733]: E0318 10:44:10.178116 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:44:10 crc kubenswrapper[4733]: I0318 10:44:10.327800 4733 scope.go:117] "RemoveContainer" containerID="ac18342a3539a4f4eb0b18430ab3c33bd2af4e21dfc3695dc34573a145ad949d"
Mar 18 10:44:18 crc kubenswrapper[4733]: I0318 10:44:18.175468 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"
Mar 18 10:44:18 crc kubenswrapper[4733]: E0318 10:44:18.176322 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:44:21 crc kubenswrapper[4733]: I0318 10:44:21.188725 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:44:21 crc kubenswrapper[4733]: E0318 10:44:21.189445 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:44:23 crc kubenswrapper[4733]: I0318 10:44:23.175412 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"
Mar 18 10:44:23 crc kubenswrapper[4733]: E0318 10:44:23.175885 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:44:30 crc kubenswrapper[4733]: I0318 10:44:30.176069 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"
Mar 18 10:44:30 crc kubenswrapper[4733]: E0318 10:44:30.177219 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:44:35 crc kubenswrapper[4733]: I0318 10:44:35.176306 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:44:35 crc kubenswrapper[4733]: E0318 10:44:35.177043 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 10:44:36 crc kubenswrapper[4733]: I0318 10:44:36.178945 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"
Mar 18 10:44:36 crc kubenswrapper[4733]: E0318 10:44:36.179385 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:44:42 crc kubenswrapper[4733]: I0318 10:44:42.176962 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"
Mar 18 10:44:42 crc kubenswrapper[4733]: E0318 10:44:42.178223 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:44:46 crc kubenswrapper[4733]: I0318 10:44:46.175622 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637"
Mar 18 10:44:46 crc kubenswrapper[4733]: I0318 10:44:46.559098 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"4aabe714853c502719880f7f27bb562465a6a84fdec9e321e389ec23753f6337"}
Mar 18 10:44:50 crc kubenswrapper[4733]: I0318 10:44:50.175567 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8"
Mar 18 10:44:50 crc kubenswrapper[4733]: E0318 10:44:50.176671 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:44:53 crc kubenswrapper[4733]: I0318 10:44:53.175252 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98"
Mar 18 10:44:53 crc kubenswrapper[4733]: E0318 10:44:53.176510 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.161104 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"]
Mar 18 10:45:00 crc kubenswrapper[4733]: E0318 10:45:00.164033 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a764647f-75c8-4ce3-82fb-b2b729a895a0" containerName="oc"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.164225 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="a764647f-75c8-4ce3-82fb-b2b729a895a0" containerName="oc"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.164655 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="a764647f-75c8-4ce3-82fb-b2b729a895a0" containerName="oc"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.166361 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.171716 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.174142 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"]
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.175308 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.231379 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-config-volume\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.231465 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khzz\" (UniqueName: \"kubernetes.io/projected/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-kube-api-access-4khzz\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.231545 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-secret-volume\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.334074 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-config-volume\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.334131 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khzz\" (UniqueName: \"kubernetes.io/projected/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-kube-api-access-4khzz\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.334185 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-secret-volume\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.335220 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-config-volume\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.344289 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-secret-volume\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.361756 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khzz\" (UniqueName: \"kubernetes.io/projected/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-kube-api-access-4khzz\") pod \"collect-profiles-29563845-8g7fj\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:00 crc kubenswrapper[4733]: I0318 10:45:00.488824 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"
Mar 18 10:45:01 crc kubenswrapper[4733]: W0318 10:45:01.003377 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb43c9f9c_7dd6_45fd_b0e2_66e4e6de9852.slice/crio-7d997435b256176d51d88b14b3339d5330b41b331096ed74154ace289a620b7f WatchSource:0}: Error finding container 7d997435b256176d51d88b14b3339d5330b41b331096ed74154ace289a620b7f: Status 404 returned error can't find the container with id 7d997435b256176d51d88b14b3339d5330b41b331096ed74154ace289a620b7f
Mar 18 10:45:01 crc kubenswrapper[4733]: I0318 10:45:01.004942 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj"]
Mar 18 10:45:01 crc kubenswrapper[4733]: I0318 10:45:01.026455 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj" event={"ID":"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852","Type":"ContainerStarted","Data":"7d997435b256176d51d88b14b3339d5330b41b331096ed74154ace289a620b7f"}
Mar 18 10:45:01 crc 
kubenswrapper[4733]: I0318 10:45:01.179946 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:45:01 crc kubenswrapper[4733]: E0318 10:45:01.180599 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:45:02 crc kubenswrapper[4733]: I0318 10:45:02.428951 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj" event={"ID":"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852","Type":"ContainerDied","Data":"10b0b28bf2347978c5f83c64e4ba31d03fc0ca97939b344013089211d7ff71e7"} Mar 18 10:45:02 crc kubenswrapper[4733]: I0318 10:45:02.428642 4733 generic.go:334] "Generic (PLEG): container finished" podID="b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852" containerID="10b0b28bf2347978c5f83c64e4ba31d03fc0ca97939b344013089211d7ff71e7" exitCode=0 Mar 18 10:45:03 crc kubenswrapper[4733]: I0318 10:45:03.792068 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj" Mar 18 10:45:03 crc kubenswrapper[4733]: I0318 10:45:03.928396 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-secret-volume\") pod \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " Mar 18 10:45:03 crc kubenswrapper[4733]: I0318 10:45:03.928599 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-config-volume\") pod \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " Mar 18 10:45:03 crc kubenswrapper[4733]: I0318 10:45:03.928657 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4khzz\" (UniqueName: \"kubernetes.io/projected/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-kube-api-access-4khzz\") pod \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\" (UID: \"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852\") " Mar 18 10:45:03 crc kubenswrapper[4733]: I0318 10:45:03.929895 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-config-volume" (OuterVolumeSpecName: "config-volume") pod "b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852" (UID: "b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 10:45:03 crc kubenswrapper[4733]: I0318 10:45:03.939035 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852" (UID: "b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 10:45:03 crc kubenswrapper[4733]: I0318 10:45:03.939176 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-kube-api-access-4khzz" (OuterVolumeSpecName: "kube-api-access-4khzz") pod "b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852" (UID: "b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852"). InnerVolumeSpecName "kube-api-access-4khzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:45:04 crc kubenswrapper[4733]: I0318 10:45:04.030521 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4khzz\" (UniqueName: \"kubernetes.io/projected/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-kube-api-access-4khzz\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:04 crc kubenswrapper[4733]: I0318 10:45:04.030825 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:04 crc kubenswrapper[4733]: I0318 10:45:04.030956 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 10:45:04 crc kubenswrapper[4733]: I0318 10:45:04.450884 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj" event={"ID":"b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852","Type":"ContainerDied","Data":"7d997435b256176d51d88b14b3339d5330b41b331096ed74154ace289a620b7f"} Mar 18 10:45:04 crc kubenswrapper[4733]: I0318 10:45:04.450934 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d997435b256176d51d88b14b3339d5330b41b331096ed74154ace289a620b7f" Mar 18 10:45:04 crc kubenswrapper[4733]: I0318 10:45:04.451408 4733 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563845-8g7fj" Mar 18 10:45:06 crc kubenswrapper[4733]: I0318 10:45:06.175499 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:45:06 crc kubenswrapper[4733]: E0318 10:45:06.176549 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:45:13 crc kubenswrapper[4733]: I0318 10:45:13.179603 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:45:13 crc kubenswrapper[4733]: E0318 10:45:13.180960 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:45:18 crc kubenswrapper[4733]: I0318 10:45:18.175777 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:45:18 crc kubenswrapper[4733]: E0318 10:45:18.176397 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:45:24 crc kubenswrapper[4733]: I0318 10:45:24.176377 4733 scope.go:117] "RemoveContainer" 
containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:45:24 crc kubenswrapper[4733]: E0318 10:45:24.177960 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:45:32 crc kubenswrapper[4733]: I0318 10:45:32.177435 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:45:32 crc kubenswrapper[4733]: E0318 10:45:32.178511 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:45:35 crc kubenswrapper[4733]: I0318 10:45:35.175785 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:45:35 crc kubenswrapper[4733]: E0318 10:45:35.176789 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:45:46 crc kubenswrapper[4733]: I0318 10:45:46.175643 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:45:46 crc kubenswrapper[4733]: I0318 10:45:46.176394 4733 scope.go:117] "RemoveContainer" 
containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:45:46 crc kubenswrapper[4733]: E0318 10:45:46.176688 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:45:46 crc kubenswrapper[4733]: E0318 10:45:46.176810 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:45:57 crc kubenswrapper[4733]: I0318 10:45:57.175681 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:45:57 crc kubenswrapper[4733]: E0318 10:45:57.176879 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:45:58 crc kubenswrapper[4733]: I0318 10:45:58.175806 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:45:58 crc kubenswrapper[4733]: E0318 10:45:58.176384 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" 
pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.164715 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563846-zfzc4"] Mar 18 10:46:00 crc kubenswrapper[4733]: E0318 10:46:00.165756 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852" containerName="collect-profiles" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.165779 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852" containerName="collect-profiles" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.166135 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43c9f9c-7dd6-45fd-b0e2-66e4e6de9852" containerName="collect-profiles" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.167059 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-zfzc4" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.171084 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.172843 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.174002 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.179410 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-zfzc4"] Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.187938 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2qcn\" (UniqueName: 
\"kubernetes.io/projected/02d9650e-918c-4ff4-82bd-ba01e08b6588-kube-api-access-s2qcn\") pod \"auto-csr-approver-29563846-zfzc4\" (UID: \"02d9650e-918c-4ff4-82bd-ba01e08b6588\") " pod="openshift-infra/auto-csr-approver-29563846-zfzc4" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.289338 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2qcn\" (UniqueName: \"kubernetes.io/projected/02d9650e-918c-4ff4-82bd-ba01e08b6588-kube-api-access-s2qcn\") pod \"auto-csr-approver-29563846-zfzc4\" (UID: \"02d9650e-918c-4ff4-82bd-ba01e08b6588\") " pod="openshift-infra/auto-csr-approver-29563846-zfzc4" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.323039 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2qcn\" (UniqueName: \"kubernetes.io/projected/02d9650e-918c-4ff4-82bd-ba01e08b6588-kube-api-access-s2qcn\") pod \"auto-csr-approver-29563846-zfzc4\" (UID: \"02d9650e-918c-4ff4-82bd-ba01e08b6588\") " pod="openshift-infra/auto-csr-approver-29563846-zfzc4" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.507619 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-zfzc4" Mar 18 10:46:00 crc kubenswrapper[4733]: I0318 10:46:00.959812 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-zfzc4"] Mar 18 10:46:00 crc kubenswrapper[4733]: W0318 10:46:00.966378 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02d9650e_918c_4ff4_82bd_ba01e08b6588.slice/crio-072908d45ce872d36b19bfa216060105bf64d028952b77fda23a47fde15be9a1 WatchSource:0}: Error finding container 072908d45ce872d36b19bfa216060105bf64d028952b77fda23a47fde15be9a1: Status 404 returned error can't find the container with id 072908d45ce872d36b19bfa216060105bf64d028952b77fda23a47fde15be9a1 Mar 18 10:46:01 crc kubenswrapper[4733]: I0318 10:46:01.084330 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-zfzc4" event={"ID":"02d9650e-918c-4ff4-82bd-ba01e08b6588","Type":"ContainerStarted","Data":"072908d45ce872d36b19bfa216060105bf64d028952b77fda23a47fde15be9a1"} Mar 18 10:46:04 crc kubenswrapper[4733]: I0318 10:46:04.118889 4733 generic.go:334] "Generic (PLEG): container finished" podID="02d9650e-918c-4ff4-82bd-ba01e08b6588" containerID="df6a0cb730f6fbfc1289efd16f5838442dcf854d748f1b313f239fe3a8ed31b9" exitCode=0 Mar 18 10:46:04 crc kubenswrapper[4733]: I0318 10:46:04.119036 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-zfzc4" event={"ID":"02d9650e-918c-4ff4-82bd-ba01e08b6588","Type":"ContainerDied","Data":"df6a0cb730f6fbfc1289efd16f5838442dcf854d748f1b313f239fe3a8ed31b9"} Mar 18 10:46:05 crc kubenswrapper[4733]: I0318 10:46:05.509704 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-zfzc4" Mar 18 10:46:05 crc kubenswrapper[4733]: I0318 10:46:05.681478 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2qcn\" (UniqueName: \"kubernetes.io/projected/02d9650e-918c-4ff4-82bd-ba01e08b6588-kube-api-access-s2qcn\") pod \"02d9650e-918c-4ff4-82bd-ba01e08b6588\" (UID: \"02d9650e-918c-4ff4-82bd-ba01e08b6588\") " Mar 18 10:46:05 crc kubenswrapper[4733]: I0318 10:46:05.687402 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d9650e-918c-4ff4-82bd-ba01e08b6588-kube-api-access-s2qcn" (OuterVolumeSpecName: "kube-api-access-s2qcn") pod "02d9650e-918c-4ff4-82bd-ba01e08b6588" (UID: "02d9650e-918c-4ff4-82bd-ba01e08b6588"). InnerVolumeSpecName "kube-api-access-s2qcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:46:05 crc kubenswrapper[4733]: I0318 10:46:05.782811 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2qcn\" (UniqueName: \"kubernetes.io/projected/02d9650e-918c-4ff4-82bd-ba01e08b6588-kube-api-access-s2qcn\") on node \"crc\" DevicePath \"\"" Mar 18 10:46:06 crc kubenswrapper[4733]: I0318 10:46:06.141452 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563846-zfzc4" event={"ID":"02d9650e-918c-4ff4-82bd-ba01e08b6588","Type":"ContainerDied","Data":"072908d45ce872d36b19bfa216060105bf64d028952b77fda23a47fde15be9a1"} Mar 18 10:46:06 crc kubenswrapper[4733]: I0318 10:46:06.141498 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="072908d45ce872d36b19bfa216060105bf64d028952b77fda23a47fde15be9a1" Mar 18 10:46:06 crc kubenswrapper[4733]: I0318 10:46:06.141574 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563846-zfzc4" Mar 18 10:46:06 crc kubenswrapper[4733]: I0318 10:46:06.581996 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-lvjwz"] Mar 18 10:46:06 crc kubenswrapper[4733]: I0318 10:46:06.587717 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563840-lvjwz"] Mar 18 10:46:07 crc kubenswrapper[4733]: I0318 10:46:07.193857 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be59cc18-c769-443f-962f-042b8ba456b8" path="/var/lib/kubelet/pods/be59cc18-c769-443f-962f-042b8ba456b8/volumes" Mar 18 10:46:10 crc kubenswrapper[4733]: I0318 10:46:10.176379 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:46:10 crc kubenswrapper[4733]: E0318 10:46:10.178847 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:46:10 crc kubenswrapper[4733]: I0318 10:46:10.455459 4733 scope.go:117] "RemoveContainer" containerID="875056c9052fb9c1578503b6e7b5412bb4aa77c6a765ef7ede001dc3e0cb6698" Mar 18 10:46:11 crc kubenswrapper[4733]: I0318 10:46:11.210654 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:46:11 crc kubenswrapper[4733]: E0318 10:46:11.211093 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:46:24 crc kubenswrapper[4733]: I0318 10:46:24.175343 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:46:24 crc kubenswrapper[4733]: E0318 10:46:24.176527 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:46:25 crc kubenswrapper[4733]: I0318 10:46:25.176391 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:46:25 crc kubenswrapper[4733]: E0318 10:46:25.176952 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:46:37 crc kubenswrapper[4733]: I0318 10:46:37.175623 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:46:37 crc kubenswrapper[4733]: E0318 10:46:37.176671 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:46:40 crc kubenswrapper[4733]: I0318 10:46:40.176141 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:46:40 
crc kubenswrapper[4733]: E0318 10:46:40.176979 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:46:48 crc kubenswrapper[4733]: I0318 10:46:48.176069 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:46:48 crc kubenswrapper[4733]: E0318 10:46:48.176946 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:46:54 crc kubenswrapper[4733]: I0318 10:46:54.176860 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:46:54 crc kubenswrapper[4733]: E0318 10:46:54.178034 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:47:02 crc kubenswrapper[4733]: I0318 10:47:02.176237 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:47:02 crc kubenswrapper[4733]: E0318 10:47:02.177529 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:47:06 crc kubenswrapper[4733]: I0318 10:47:06.175955 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:47:06 crc kubenswrapper[4733]: E0318 10:47:06.176686 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:47:13 crc kubenswrapper[4733]: I0318 10:47:13.570911 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:47:13 crc kubenswrapper[4733]: I0318 10:47:13.571704 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:47:14 crc kubenswrapper[4733]: I0318 10:47:14.176001 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:47:14 crc kubenswrapper[4733]: E0318 10:47:14.176416 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" 
pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:47:21 crc kubenswrapper[4733]: I0318 10:47:21.184029 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:47:21 crc kubenswrapper[4733]: E0318 10:47:21.187343 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:47:26 crc kubenswrapper[4733]: I0318 10:47:26.175824 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:47:26 crc kubenswrapper[4733]: E0318 10:47:26.177110 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:47:34 crc kubenswrapper[4733]: I0318 10:47:34.175997 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:47:34 crc kubenswrapper[4733]: E0318 10:47:34.177043 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:47:40 crc kubenswrapper[4733]: I0318 10:47:40.176579 4733 scope.go:117] "RemoveContainer" 
containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:47:40 crc kubenswrapper[4733]: E0318 10:47:40.177800 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:47:43 crc kubenswrapper[4733]: I0318 10:47:43.571738 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:47:43 crc kubenswrapper[4733]: I0318 10:47:43.572181 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.711144 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7fmwk"] Mar 18 10:47:46 crc kubenswrapper[4733]: E0318 10:47:46.712126 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d9650e-918c-4ff4-82bd-ba01e08b6588" containerName="oc" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.712149 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d9650e-918c-4ff4-82bd-ba01e08b6588" containerName="oc" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.712513 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d9650e-918c-4ff4-82bd-ba01e08b6588" containerName="oc" Mar 18 10:47:46 crc 
kubenswrapper[4733]: I0318 10:47:46.714527 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.719176 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7fmwk"] Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.770439 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-catalog-content\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.770665 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-utilities\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.770748 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbkb\" (UniqueName: \"kubernetes.io/projected/73638d51-1c01-4323-91f7-ef8fba5f8654-kube-api-access-vfbkb\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.872331 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-utilities\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc 
kubenswrapper[4733]: I0318 10:47:46.872396 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbkb\" (UniqueName: \"kubernetes.io/projected/73638d51-1c01-4323-91f7-ef8fba5f8654-kube-api-access-vfbkb\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.872424 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-catalog-content\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.873054 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-catalog-content\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.873081 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-utilities\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:46 crc kubenswrapper[4733]: I0318 10:47:46.903286 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbkb\" (UniqueName: \"kubernetes.io/projected/73638d51-1c01-4323-91f7-ef8fba5f8654-kube-api-access-vfbkb\") pod \"community-operators-7fmwk\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:47 crc kubenswrapper[4733]: I0318 
10:47:47.040576 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:47 crc kubenswrapper[4733]: I0318 10:47:47.358856 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7fmwk"] Mar 18 10:47:48 crc kubenswrapper[4733]: I0318 10:47:48.175710 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:47:48 crc kubenswrapper[4733]: E0318 10:47:48.176966 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:47:48 crc kubenswrapper[4733]: I0318 10:47:48.258450 4733 generic.go:334] "Generic (PLEG): container finished" podID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerID="7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27" exitCode=0 Mar 18 10:47:48 crc kubenswrapper[4733]: I0318 10:47:48.258533 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fmwk" event={"ID":"73638d51-1c01-4323-91f7-ef8fba5f8654","Type":"ContainerDied","Data":"7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27"} Mar 18 10:47:48 crc kubenswrapper[4733]: I0318 10:47:48.258620 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fmwk" event={"ID":"73638d51-1c01-4323-91f7-ef8fba5f8654","Type":"ContainerStarted","Data":"928d80f6e5ee50442d6ebfaa23c4a5899ef5981249f762df7a2c9401fd78c54d"} Mar 18 10:47:49 crc kubenswrapper[4733]: I0318 10:47:49.267706 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fmwk" 
event={"ID":"73638d51-1c01-4323-91f7-ef8fba5f8654","Type":"ContainerStarted","Data":"c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee"} Mar 18 10:47:50 crc kubenswrapper[4733]: I0318 10:47:50.281098 4733 generic.go:334] "Generic (PLEG): container finished" podID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerID="c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee" exitCode=0 Mar 18 10:47:50 crc kubenswrapper[4733]: I0318 10:47:50.281176 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fmwk" event={"ID":"73638d51-1c01-4323-91f7-ef8fba5f8654","Type":"ContainerDied","Data":"c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee"} Mar 18 10:47:51 crc kubenswrapper[4733]: I0318 10:47:51.315667 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fmwk" event={"ID":"73638d51-1c01-4323-91f7-ef8fba5f8654","Type":"ContainerStarted","Data":"19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d"} Mar 18 10:47:51 crc kubenswrapper[4733]: I0318 10:47:51.345280 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7fmwk" podStartSLOduration=2.912951338 podStartE2EDuration="5.34525992s" podCreationTimestamp="2026-03-18 10:47:46 +0000 UTC" firstStartedPulling="2026-03-18 10:47:48.261231371 +0000 UTC m=+2107.752965726" lastFinishedPulling="2026-03-18 10:47:50.693539953 +0000 UTC m=+2110.185274308" observedRunningTime="2026-03-18 10:47:51.341830333 +0000 UTC m=+2110.833564698" watchObservedRunningTime="2026-03-18 10:47:51.34525992 +0000 UTC m=+2110.836994255" Mar 18 10:47:54 crc kubenswrapper[4733]: I0318 10:47:54.175514 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:47:54 crc kubenswrapper[4733]: E0318 10:47:54.176261 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:47:57 crc kubenswrapper[4733]: I0318 10:47:57.041609 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:57 crc kubenswrapper[4733]: I0318 10:47:57.042016 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:57 crc kubenswrapper[4733]: I0318 10:47:57.121141 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:57 crc kubenswrapper[4733]: I0318 10:47:57.441375 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:47:57 crc kubenswrapper[4733]: I0318 10:47:57.498627 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7fmwk"] Mar 18 10:47:59 crc kubenswrapper[4733]: I0318 10:47:59.385859 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7fmwk" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerName="registry-server" containerID="cri-o://19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d" gracePeriod=2 Mar 18 10:47:59 crc kubenswrapper[4733]: I0318 10:47:59.910825 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.111363 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-utilities\") pod \"73638d51-1c01-4323-91f7-ef8fba5f8654\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.111503 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfbkb\" (UniqueName: \"kubernetes.io/projected/73638d51-1c01-4323-91f7-ef8fba5f8654-kube-api-access-vfbkb\") pod \"73638d51-1c01-4323-91f7-ef8fba5f8654\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.111660 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-catalog-content\") pod \"73638d51-1c01-4323-91f7-ef8fba5f8654\" (UID: \"73638d51-1c01-4323-91f7-ef8fba5f8654\") " Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.112595 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-utilities" (OuterVolumeSpecName: "utilities") pod "73638d51-1c01-4323-91f7-ef8fba5f8654" (UID: "73638d51-1c01-4323-91f7-ef8fba5f8654"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.131857 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73638d51-1c01-4323-91f7-ef8fba5f8654-kube-api-access-vfbkb" (OuterVolumeSpecName: "kube-api-access-vfbkb") pod "73638d51-1c01-4323-91f7-ef8fba5f8654" (UID: "73638d51-1c01-4323-91f7-ef8fba5f8654"). InnerVolumeSpecName "kube-api-access-vfbkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.163323 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563848-2tfjq"] Mar 18 10:48:00 crc kubenswrapper[4733]: E0318 10:48:00.167173 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerName="registry-server" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.167380 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerName="registry-server" Mar 18 10:48:00 crc kubenswrapper[4733]: E0318 10:48:00.167492 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerName="extract-utilities" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.167591 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerName="extract-utilities" Mar 18 10:48:00 crc kubenswrapper[4733]: E0318 10:48:00.167698 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerName="extract-content" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.167799 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerName="extract-content" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.169583 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerName="registry-server" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.170627 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.173940 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.174347 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.174564 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.181385 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-2tfjq"] Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.190989 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73638d51-1c01-4323-91f7-ef8fba5f8654" (UID: "73638d51-1c01-4323-91f7-ef8fba5f8654"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.214154 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.214219 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfbkb\" (UniqueName: \"kubernetes.io/projected/73638d51-1c01-4323-91f7-ef8fba5f8654-kube-api-access-vfbkb\") on node \"crc\" DevicePath \"\"" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.214234 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73638d51-1c01-4323-91f7-ef8fba5f8654-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.315544 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x77ls\" (UniqueName: \"kubernetes.io/projected/1bc1a1de-2965-422f-8ac4-77358d0d7df6-kube-api-access-x77ls\") pod \"auto-csr-approver-29563848-2tfjq\" (UID: \"1bc1a1de-2965-422f-8ac4-77358d0d7df6\") " pod="openshift-infra/auto-csr-approver-29563848-2tfjq" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.395975 4733 generic.go:334] "Generic (PLEG): container finished" podID="73638d51-1c01-4323-91f7-ef8fba5f8654" containerID="19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d" exitCode=0 Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.396034 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fmwk" event={"ID":"73638d51-1c01-4323-91f7-ef8fba5f8654","Type":"ContainerDied","Data":"19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d"} Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.396057 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7fmwk" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.396073 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fmwk" event={"ID":"73638d51-1c01-4323-91f7-ef8fba5f8654","Type":"ContainerDied","Data":"928d80f6e5ee50442d6ebfaa23c4a5899ef5981249f762df7a2c9401fd78c54d"} Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.396107 4733 scope.go:117] "RemoveContainer" containerID="19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.423328 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x77ls\" (UniqueName: \"kubernetes.io/projected/1bc1a1de-2965-422f-8ac4-77358d0d7df6-kube-api-access-x77ls\") pod \"auto-csr-approver-29563848-2tfjq\" (UID: \"1bc1a1de-2965-422f-8ac4-77358d0d7df6\") " pod="openshift-infra/auto-csr-approver-29563848-2tfjq" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.439743 4733 scope.go:117] "RemoveContainer" containerID="c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.445838 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7fmwk"] Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.458754 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7fmwk"] Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.459106 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x77ls\" (UniqueName: \"kubernetes.io/projected/1bc1a1de-2965-422f-8ac4-77358d0d7df6-kube-api-access-x77ls\") pod \"auto-csr-approver-29563848-2tfjq\" (UID: \"1bc1a1de-2965-422f-8ac4-77358d0d7df6\") " pod="openshift-infra/auto-csr-approver-29563848-2tfjq" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.477962 4733 
scope.go:117] "RemoveContainer" containerID="7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.519963 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.522286 4733 scope.go:117] "RemoveContainer" containerID="19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d" Mar 18 10:48:00 crc kubenswrapper[4733]: E0318 10:48:00.522776 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d\": container with ID starting with 19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d not found: ID does not exist" containerID="19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.522813 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d"} err="failed to get container status \"19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d\": rpc error: code = NotFound desc = could not find container \"19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d\": container with ID starting with 19e7075c2d39be3d5800b88e9f5f6ed50e174eda4a0526fd16da79a0781fe40d not found: ID does not exist" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.522863 4733 scope.go:117] "RemoveContainer" containerID="c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee" Mar 18 10:48:00 crc kubenswrapper[4733]: E0318 10:48:00.523177 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee\": container with ID 
starting with c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee not found: ID does not exist" containerID="c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.523221 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee"} err="failed to get container status \"c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee\": rpc error: code = NotFound desc = could not find container \"c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee\": container with ID starting with c5107691d43bccc1194ae8e4cd1cf5beed33bacfad0b1a00566988d322aecaee not found: ID does not exist" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.523245 4733 scope.go:117] "RemoveContainer" containerID="7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27" Mar 18 10:48:00 crc kubenswrapper[4733]: E0318 10:48:00.523591 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27\": container with ID starting with 7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27 not found: ID does not exist" containerID="7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27" Mar 18 10:48:00 crc kubenswrapper[4733]: I0318 10:48:00.523624 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27"} err="failed to get container status \"7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27\": rpc error: code = NotFound desc = could not find container \"7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27\": container with ID starting with 7c559db9726092272a44b539af5e62593a410a2361a733d6503432107bd1fe27 not found: 
ID does not exist" Mar 18 10:48:01 crc kubenswrapper[4733]: W0318 10:48:01.042962 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bc1a1de_2965_422f_8ac4_77358d0d7df6.slice/crio-23a710f8d256ee258f90aa54f124b4ae12454a3723d975a0640cc7e9a315b4e6 WatchSource:0}: Error finding container 23a710f8d256ee258f90aa54f124b4ae12454a3723d975a0640cc7e9a315b4e6: Status 404 returned error can't find the container with id 23a710f8d256ee258f90aa54f124b4ae12454a3723d975a0640cc7e9a315b4e6 Mar 18 10:48:01 crc kubenswrapper[4733]: I0318 10:48:01.047821 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-2tfjq"] Mar 18 10:48:01 crc kubenswrapper[4733]: I0318 10:48:01.184661 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:48:01 crc kubenswrapper[4733]: E0318 10:48:01.184993 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:48:01 crc kubenswrapper[4733]: I0318 10:48:01.188988 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73638d51-1c01-4323-91f7-ef8fba5f8654" path="/var/lib/kubelet/pods/73638d51-1c01-4323-91f7-ef8fba5f8654/volumes" Mar 18 10:48:01 crc kubenswrapper[4733]: I0318 10:48:01.411475 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" event={"ID":"1bc1a1de-2965-422f-8ac4-77358d0d7df6","Type":"ContainerStarted","Data":"23a710f8d256ee258f90aa54f124b4ae12454a3723d975a0640cc7e9a315b4e6"} Mar 18 10:48:02 crc kubenswrapper[4733]: I0318 10:48:02.420972 4733 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" event={"ID":"1bc1a1de-2965-422f-8ac4-77358d0d7df6","Type":"ContainerStarted","Data":"6fe3d8a40e1ba17924153a168a35cf8fd5e9cc3d1fdeb9b0f70b81b8350f5f56"} Mar 18 10:48:02 crc kubenswrapper[4733]: I0318 10:48:02.435980 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" podStartSLOduration=1.613556582 podStartE2EDuration="2.435958701s" podCreationTimestamp="2026-03-18 10:48:00 +0000 UTC" firstStartedPulling="2026-03-18 10:48:01.048358665 +0000 UTC m=+2120.540093030" lastFinishedPulling="2026-03-18 10:48:01.870760794 +0000 UTC m=+2121.362495149" observedRunningTime="2026-03-18 10:48:02.434074088 +0000 UTC m=+2121.925808423" watchObservedRunningTime="2026-03-18 10:48:02.435958701 +0000 UTC m=+2121.927693036" Mar 18 10:48:03 crc kubenswrapper[4733]: I0318 10:48:03.433392 4733 generic.go:334] "Generic (PLEG): container finished" podID="1bc1a1de-2965-422f-8ac4-77358d0d7df6" containerID="6fe3d8a40e1ba17924153a168a35cf8fd5e9cc3d1fdeb9b0f70b81b8350f5f56" exitCode=0 Mar 18 10:48:03 crc kubenswrapper[4733]: I0318 10:48:03.433438 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" event={"ID":"1bc1a1de-2965-422f-8ac4-77358d0d7df6","Type":"ContainerDied","Data":"6fe3d8a40e1ba17924153a168a35cf8fd5e9cc3d1fdeb9b0f70b81b8350f5f56"} Mar 18 10:48:04 crc kubenswrapper[4733]: I0318 10:48:04.780760 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" Mar 18 10:48:04 crc kubenswrapper[4733]: I0318 10:48:04.901017 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x77ls\" (UniqueName: \"kubernetes.io/projected/1bc1a1de-2965-422f-8ac4-77358d0d7df6-kube-api-access-x77ls\") pod \"1bc1a1de-2965-422f-8ac4-77358d0d7df6\" (UID: \"1bc1a1de-2965-422f-8ac4-77358d0d7df6\") " Mar 18 10:48:04 crc kubenswrapper[4733]: I0318 10:48:04.909972 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc1a1de-2965-422f-8ac4-77358d0d7df6-kube-api-access-x77ls" (OuterVolumeSpecName: "kube-api-access-x77ls") pod "1bc1a1de-2965-422f-8ac4-77358d0d7df6" (UID: "1bc1a1de-2965-422f-8ac4-77358d0d7df6"). InnerVolumeSpecName "kube-api-access-x77ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:48:05 crc kubenswrapper[4733]: I0318 10:48:05.003019 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x77ls\" (UniqueName: \"kubernetes.io/projected/1bc1a1de-2965-422f-8ac4-77358d0d7df6-kube-api-access-x77ls\") on node \"crc\" DevicePath \"\"" Mar 18 10:48:05 crc kubenswrapper[4733]: I0318 10:48:05.453274 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" event={"ID":"1bc1a1de-2965-422f-8ac4-77358d0d7df6","Type":"ContainerDied","Data":"23a710f8d256ee258f90aa54f124b4ae12454a3723d975a0640cc7e9a315b4e6"} Mar 18 10:48:05 crc kubenswrapper[4733]: I0318 10:48:05.453335 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a710f8d256ee258f90aa54f124b4ae12454a3723d975a0640cc7e9a315b4e6" Mar 18 10:48:05 crc kubenswrapper[4733]: I0318 10:48:05.453355 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563848-2tfjq" Mar 18 10:48:05 crc kubenswrapper[4733]: I0318 10:48:05.541273 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-78hdh"] Mar 18 10:48:05 crc kubenswrapper[4733]: I0318 10:48:05.552639 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563842-78hdh"] Mar 18 10:48:06 crc kubenswrapper[4733]: I0318 10:48:06.176681 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:48:06 crc kubenswrapper[4733]: E0318 10:48:06.177071 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:48:07 crc kubenswrapper[4733]: I0318 10:48:07.191578 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6751ff-c17a-472f-b315-06edff233f07" path="/var/lib/kubelet/pods/8a6751ff-c17a-472f-b315-06edff233f07/volumes" Mar 18 10:48:10 crc kubenswrapper[4733]: I0318 10:48:10.583357 4733 scope.go:117] "RemoveContainer" containerID="56204ccc444d383e182644d7909ef75775c83c4da5e6940b6afcaf6c25fa0fc2" Mar 18 10:48:13 crc kubenswrapper[4733]: I0318 10:48:13.575914 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:48:13 crc kubenswrapper[4733]: I0318 10:48:13.576607 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" 
podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:48:13 crc kubenswrapper[4733]: I0318 10:48:13.576675 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:48:13 crc kubenswrapper[4733]: I0318 10:48:13.577588 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4aabe714853c502719880f7f27bb562465a6a84fdec9e321e389ec23753f6337"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:48:13 crc kubenswrapper[4733]: I0318 10:48:13.577687 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://4aabe714853c502719880f7f27bb562465a6a84fdec9e321e389ec23753f6337" gracePeriod=600 Mar 18 10:48:14 crc kubenswrapper[4733]: I0318 10:48:14.175999 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:48:14 crc kubenswrapper[4733]: E0318 10:48:14.177114 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:48:14 crc kubenswrapper[4733]: I0318 10:48:14.536706 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" 
containerID="4aabe714853c502719880f7f27bb562465a6a84fdec9e321e389ec23753f6337" exitCode=0 Mar 18 10:48:14 crc kubenswrapper[4733]: I0318 10:48:14.536747 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"4aabe714853c502719880f7f27bb562465a6a84fdec9e321e389ec23753f6337"} Mar 18 10:48:14 crc kubenswrapper[4733]: I0318 10:48:14.536773 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48"} Mar 18 10:48:14 crc kubenswrapper[4733]: I0318 10:48:14.536789 4733 scope.go:117] "RemoveContainer" containerID="fc33062a38e6003bcfe678b0b641bcd73299a07f8dcc32e6f590e8bb7c29b637" Mar 18 10:48:21 crc kubenswrapper[4733]: I0318 10:48:21.179792 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:48:21 crc kubenswrapper[4733]: E0318 10:48:21.180560 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:48:26 crc kubenswrapper[4733]: I0318 10:48:26.176926 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:48:26 crc kubenswrapper[4733]: I0318 10:48:26.671917 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60"} Mar 18 10:48:26 crc kubenswrapper[4733]: I0318 10:48:26.673120 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 10:48:30 crc kubenswrapper[4733]: I0318 10:48:30.709972 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" exitCode=0 Mar 18 10:48:30 crc kubenswrapper[4733]: I0318 10:48:30.710102 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60"} Mar 18 10:48:30 crc kubenswrapper[4733]: I0318 10:48:30.710530 4733 scope.go:117] "RemoveContainer" containerID="c6feb5dbe0273076fcc0fc6ebcba5d0f6774752ba12f9bdf9dbed3f20d646a98" Mar 18 10:48:30 crc kubenswrapper[4733]: I0318 10:48:30.711421 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:48:30 crc kubenswrapper[4733]: E0318 10:48:30.711792 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:48:32 crc kubenswrapper[4733]: I0318 10:48:32.176218 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:48:32 crc kubenswrapper[4733]: I0318 10:48:32.735312 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f"} Mar 18 10:48:32 crc kubenswrapper[4733]: I0318 10:48:32.735832 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:48:37 crc kubenswrapper[4733]: I0318 10:48:37.779537 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" exitCode=0 Mar 18 10:48:37 crc kubenswrapper[4733]: I0318 10:48:37.779620 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f"} Mar 18 10:48:37 crc kubenswrapper[4733]: I0318 10:48:37.779842 4733 scope.go:117] "RemoveContainer" containerID="5270a9bb2caacb712ce1222dbbad48bb5b8a42db385b8791217d8ea1eab704e8" Mar 18 10:48:37 crc kubenswrapper[4733]: I0318 10:48:37.780658 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:48:37 crc kubenswrapper[4733]: E0318 10:48:37.781252 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:48:43 crc kubenswrapper[4733]: I0318 10:48:43.176383 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:48:43 crc kubenswrapper[4733]: E0318 10:48:43.177363 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:48:51 crc kubenswrapper[4733]: I0318 10:48:51.187069 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:48:51 crc kubenswrapper[4733]: E0318 10:48:51.188426 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:48:57 crc kubenswrapper[4733]: I0318 10:48:57.176583 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:48:57 crc kubenswrapper[4733]: E0318 10:48:57.177854 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:49:02 crc kubenswrapper[4733]: I0318 10:49:02.176826 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:49:02 crc kubenswrapper[4733]: E0318 10:49:02.177510 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" 
Mar 18 10:49:10 crc kubenswrapper[4733]: I0318 10:49:10.176168 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:49:10 crc kubenswrapper[4733]: E0318 10:49:10.177301 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:49:14 crc kubenswrapper[4733]: I0318 10:49:14.176704 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:49:14 crc kubenswrapper[4733]: E0318 10:49:14.177778 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:49:23 crc kubenswrapper[4733]: I0318 10:49:23.176499 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:49:23 crc kubenswrapper[4733]: E0318 10:49:23.177831 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:49:28 crc kubenswrapper[4733]: I0318 10:49:28.175665 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:49:28 crc kubenswrapper[4733]: E0318 10:49:28.176881 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:49:38 crc kubenswrapper[4733]: I0318 10:49:38.175699 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:49:38 crc kubenswrapper[4733]: E0318 10:49:38.176675 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:49:40 crc kubenswrapper[4733]: I0318 10:49:40.175928 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:49:40 crc kubenswrapper[4733]: E0318 10:49:40.176991 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.634946 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2nkxk"] Mar 18 10:49:47 crc kubenswrapper[4733]: E0318 10:49:47.636483 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc1a1de-2965-422f-8ac4-77358d0d7df6" containerName="oc" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.636506 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1bc1a1de-2965-422f-8ac4-77358d0d7df6" containerName="oc" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.637019 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc1a1de-2965-422f-8ac4-77358d0d7df6" containerName="oc" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.641996 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.692314 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nkxk"] Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.755852 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8vm\" (UniqueName: \"kubernetes.io/projected/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-kube-api-access-5h8vm\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.755960 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-utilities\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.755993 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-catalog-content\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.857975 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5h8vm\" (UniqueName: \"kubernetes.io/projected/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-kube-api-access-5h8vm\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.858084 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-utilities\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.858122 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-catalog-content\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.858723 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-catalog-content\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.858869 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-utilities\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:47 crc kubenswrapper[4733]: I0318 10:49:47.880816 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8vm\" (UniqueName: 
\"kubernetes.io/projected/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-kube-api-access-5h8vm\") pod \"redhat-operators-2nkxk\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.002512 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.220543 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p567z"] Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.223502 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.240049 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p567z"] Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.365227 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-utilities\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.365568 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gxdl\" (UniqueName: \"kubernetes.io/projected/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-kube-api-access-6gxdl\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.365702 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-catalog-content\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.469353 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nkxk"] Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.470302 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gxdl\" (UniqueName: \"kubernetes.io/projected/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-kube-api-access-6gxdl\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.470394 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-catalog-content\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.470416 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-utilities\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.470982 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-utilities\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 
10:49:48.471385 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-catalog-content\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.494065 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gxdl\" (UniqueName: \"kubernetes.io/projected/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-kube-api-access-6gxdl\") pod \"redhat-marketplace-p567z\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.500605 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nkxk" event={"ID":"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2","Type":"ContainerStarted","Data":"ecd120dd90e1036bb5e6d554a5806a7bd302457439032a59781bedfcd41b555a"} Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.540502 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:48 crc kubenswrapper[4733]: I0318 10:49:48.960584 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p567z"] Mar 18 10:49:49 crc kubenswrapper[4733]: I0318 10:49:49.509103 4733 generic.go:334] "Generic (PLEG): container finished" podID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerID="716b0e092d29f01bae19a735d99fb8b66ebdd4dfe0eb107d62d5d4661ad290b4" exitCode=0 Mar 18 10:49:49 crc kubenswrapper[4733]: I0318 10:49:49.509180 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p567z" event={"ID":"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896","Type":"ContainerDied","Data":"716b0e092d29f01bae19a735d99fb8b66ebdd4dfe0eb107d62d5d4661ad290b4"} Mar 18 10:49:49 crc kubenswrapper[4733]: I0318 10:49:49.509271 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p567z" event={"ID":"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896","Type":"ContainerStarted","Data":"5941ae8ebdec11816d7bbdab01c89fa7a8cb1647d4a432cd5fc43d5b8d08123c"} Mar 18 10:49:49 crc kubenswrapper[4733]: I0318 10:49:49.511053 4733 generic.go:334] "Generic (PLEG): container finished" podID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerID="c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42" exitCode=0 Mar 18 10:49:49 crc kubenswrapper[4733]: I0318 10:49:49.511079 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nkxk" event={"ID":"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2","Type":"ContainerDied","Data":"c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42"} Mar 18 10:49:49 crc kubenswrapper[4733]: I0318 10:49:49.511565 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:49:50 crc kubenswrapper[4733]: I0318 10:49:50.176127 4733 scope.go:117] "RemoveContainer" 
containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:49:50 crc kubenswrapper[4733]: E0318 10:49:50.176946 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:49:50 crc kubenswrapper[4733]: I0318 10:49:50.519812 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p567z" event={"ID":"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896","Type":"ContainerStarted","Data":"d6b6c73dfb062b43a2fff48a697f356e2654828365f2c4918111e23d64e38e06"} Mar 18 10:49:50 crc kubenswrapper[4733]: I0318 10:49:50.525266 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nkxk" event={"ID":"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2","Type":"ContainerStarted","Data":"c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7"} Mar 18 10:49:51 crc kubenswrapper[4733]: I0318 10:49:51.182280 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:49:51 crc kubenswrapper[4733]: E0318 10:49:51.183267 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:49:51 crc kubenswrapper[4733]: I0318 10:49:51.538288 4733 generic.go:334] "Generic (PLEG): container finished" podID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerID="d6b6c73dfb062b43a2fff48a697f356e2654828365f2c4918111e23d64e38e06" exitCode=0 Mar 18 10:49:51 crc 
kubenswrapper[4733]: I0318 10:49:51.538369 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p567z" event={"ID":"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896","Type":"ContainerDied","Data":"d6b6c73dfb062b43a2fff48a697f356e2654828365f2c4918111e23d64e38e06"} Mar 18 10:49:51 crc kubenswrapper[4733]: I0318 10:49:51.540974 4733 generic.go:334] "Generic (PLEG): container finished" podID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerID="c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7" exitCode=0 Mar 18 10:49:51 crc kubenswrapper[4733]: I0318 10:49:51.541030 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nkxk" event={"ID":"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2","Type":"ContainerDied","Data":"c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7"} Mar 18 10:49:52 crc kubenswrapper[4733]: I0318 10:49:52.553830 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p567z" event={"ID":"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896","Type":"ContainerStarted","Data":"a2de23c37da85617652573e8df45d4ddd82910052cd6fefb446ea209a4de9185"} Mar 18 10:49:52 crc kubenswrapper[4733]: I0318 10:49:52.557334 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nkxk" event={"ID":"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2","Type":"ContainerStarted","Data":"5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262"} Mar 18 10:49:52 crc kubenswrapper[4733]: I0318 10:49:52.595041 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p567z" podStartSLOduration=2.018794697 podStartE2EDuration="4.595012614s" podCreationTimestamp="2026-03-18 10:49:48 +0000 UTC" firstStartedPulling="2026-03-18 10:49:49.511149056 +0000 UTC m=+2229.002883421" lastFinishedPulling="2026-03-18 10:49:52.087366973 +0000 UTC m=+2231.579101338" 
observedRunningTime="2026-03-18 10:49:52.581171234 +0000 UTC m=+2232.072905559" watchObservedRunningTime="2026-03-18 10:49:52.595012614 +0000 UTC m=+2232.086746979" Mar 18 10:49:52 crc kubenswrapper[4733]: I0318 10:49:52.604076 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2nkxk" podStartSLOduration=3.1111192 podStartE2EDuration="5.604056099s" podCreationTimestamp="2026-03-18 10:49:47 +0000 UTC" firstStartedPulling="2026-03-18 10:49:49.513836512 +0000 UTC m=+2229.005570877" lastFinishedPulling="2026-03-18 10:49:52.006773421 +0000 UTC m=+2231.498507776" observedRunningTime="2026-03-18 10:49:52.597989848 +0000 UTC m=+2232.089724193" watchObservedRunningTime="2026-03-18 10:49:52.604056099 +0000 UTC m=+2232.095790434" Mar 18 10:49:58 crc kubenswrapper[4733]: I0318 10:49:58.003059 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:58 crc kubenswrapper[4733]: I0318 10:49:58.003697 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:49:58 crc kubenswrapper[4733]: I0318 10:49:58.541154 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:58 crc kubenswrapper[4733]: I0318 10:49:58.541920 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:58 crc kubenswrapper[4733]: I0318 10:49:58.599097 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:58 crc kubenswrapper[4733]: I0318 10:49:58.680937 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:49:58 crc kubenswrapper[4733]: I0318 10:49:58.842463 4733 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p567z"] Mar 18 10:49:59 crc kubenswrapper[4733]: I0318 10:49:59.062163 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2nkxk" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="registry-server" probeResult="failure" output=< Mar 18 10:49:59 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Mar 18 10:49:59 crc kubenswrapper[4733]: > Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.153644 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563850-8gwpw"] Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.155311 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-8gwpw" Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.159252 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.159818 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.160769 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.165671 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-8gwpw"] Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.266847 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ww9k\" (UniqueName: \"kubernetes.io/projected/2166cf23-2a65-4b17-922e-3131be1d6d8b-kube-api-access-2ww9k\") pod \"auto-csr-approver-29563850-8gwpw\" (UID: \"2166cf23-2a65-4b17-922e-3131be1d6d8b\") " 
pod="openshift-infra/auto-csr-approver-29563850-8gwpw" Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.369466 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ww9k\" (UniqueName: \"kubernetes.io/projected/2166cf23-2a65-4b17-922e-3131be1d6d8b-kube-api-access-2ww9k\") pod \"auto-csr-approver-29563850-8gwpw\" (UID: \"2166cf23-2a65-4b17-922e-3131be1d6d8b\") " pod="openshift-infra/auto-csr-approver-29563850-8gwpw" Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.397695 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ww9k\" (UniqueName: \"kubernetes.io/projected/2166cf23-2a65-4b17-922e-3131be1d6d8b-kube-api-access-2ww9k\") pod \"auto-csr-approver-29563850-8gwpw\" (UID: \"2166cf23-2a65-4b17-922e-3131be1d6d8b\") " pod="openshift-infra/auto-csr-approver-29563850-8gwpw" Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.497075 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-8gwpw" Mar 18 10:50:00 crc kubenswrapper[4733]: I0318 10:50:00.678682 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p567z" podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerName="registry-server" containerID="cri-o://a2de23c37da85617652573e8df45d4ddd82910052cd6fefb446ea209a4de9185" gracePeriod=2 Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.042318 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-8gwpw"] Mar 18 10:50:01 crc kubenswrapper[4733]: W0318 10:50:01.053144 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2166cf23_2a65_4b17_922e_3131be1d6d8b.slice/crio-46ecf4faa11e9d76da828b425e3870304d56ac67fef7dbfbe7f42a0695dc20ec WatchSource:0}: Error finding container 
46ecf4faa11e9d76da828b425e3870304d56ac67fef7dbfbe7f42a0695dc20ec: Status 404 returned error can't find the container with id 46ecf4faa11e9d76da828b425e3870304d56ac67fef7dbfbe7f42a0695dc20ec Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.185782 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:50:01 crc kubenswrapper[4733]: E0318 10:50:01.186411 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.690454 4733 generic.go:334] "Generic (PLEG): container finished" podID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerID="a2de23c37da85617652573e8df45d4ddd82910052cd6fefb446ea209a4de9185" exitCode=0 Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.690524 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p567z" event={"ID":"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896","Type":"ContainerDied","Data":"a2de23c37da85617652573e8df45d4ddd82910052cd6fefb446ea209a4de9185"} Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.693406 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-8gwpw" event={"ID":"2166cf23-2a65-4b17-922e-3131be1d6d8b","Type":"ContainerStarted","Data":"46ecf4faa11e9d76da828b425e3870304d56ac67fef7dbfbe7f42a0695dc20ec"} Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.817473 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.916717 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gxdl\" (UniqueName: \"kubernetes.io/projected/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-kube-api-access-6gxdl\") pod \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.916842 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-utilities\") pod \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.917028 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-catalog-content\") pod \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\" (UID: \"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896\") " Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.917776 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-utilities" (OuterVolumeSpecName: "utilities") pod "b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" (UID: "b49b68f2-ffd0-4ff7-a4ae-691c0a71e896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.926386 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-kube-api-access-6gxdl" (OuterVolumeSpecName: "kube-api-access-6gxdl") pod "b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" (UID: "b49b68f2-ffd0-4ff7-a4ae-691c0a71e896"). InnerVolumeSpecName "kube-api-access-6gxdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:50:01 crc kubenswrapper[4733]: I0318 10:50:01.940309 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" (UID: "b49b68f2-ffd0-4ff7-a4ae-691c0a71e896"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.018415 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.018444 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gxdl\" (UniqueName: \"kubernetes.io/projected/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-kube-api-access-6gxdl\") on node \"crc\" DevicePath \"\"" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.018455 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.175861 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:50:02 crc kubenswrapper[4733]: E0318 10:50:02.176261 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.701818 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p567z" event={"ID":"b49b68f2-ffd0-4ff7-a4ae-691c0a71e896","Type":"ContainerDied","Data":"5941ae8ebdec11816d7bbdab01c89fa7a8cb1647d4a432cd5fc43d5b8d08123c"} Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.702113 4733 scope.go:117] "RemoveContainer" containerID="a2de23c37da85617652573e8df45d4ddd82910052cd6fefb446ea209a4de9185" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.701851 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p567z" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.704548 4733 generic.go:334] "Generic (PLEG): container finished" podID="2166cf23-2a65-4b17-922e-3131be1d6d8b" containerID="649f36c8155821a228e5fee55c54c1d5edbde655cee7563ac249384dedef675b" exitCode=0 Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.704606 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-8gwpw" event={"ID":"2166cf23-2a65-4b17-922e-3131be1d6d8b","Type":"ContainerDied","Data":"649f36c8155821a228e5fee55c54c1d5edbde655cee7563ac249384dedef675b"} Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.731919 4733 scope.go:117] "RemoveContainer" containerID="d6b6c73dfb062b43a2fff48a697f356e2654828365f2c4918111e23d64e38e06" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.761279 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p567z"] Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.763221 4733 scope.go:117] "RemoveContainer" containerID="716b0e092d29f01bae19a735d99fb8b66ebdd4dfe0eb107d62d5d4661ad290b4" Mar 18 10:50:02 crc kubenswrapper[4733]: I0318 10:50:02.768687 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p567z"] Mar 18 10:50:03 crc kubenswrapper[4733]: I0318 10:50:03.185246 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" path="/var/lib/kubelet/pods/b49b68f2-ffd0-4ff7-a4ae-691c0a71e896/volumes" Mar 18 10:50:04 crc kubenswrapper[4733]: I0318 10:50:04.113682 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-8gwpw" Mar 18 10:50:04 crc kubenswrapper[4733]: I0318 10:50:04.266839 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ww9k\" (UniqueName: \"kubernetes.io/projected/2166cf23-2a65-4b17-922e-3131be1d6d8b-kube-api-access-2ww9k\") pod \"2166cf23-2a65-4b17-922e-3131be1d6d8b\" (UID: \"2166cf23-2a65-4b17-922e-3131be1d6d8b\") " Mar 18 10:50:04 crc kubenswrapper[4733]: I0318 10:50:04.283099 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2166cf23-2a65-4b17-922e-3131be1d6d8b-kube-api-access-2ww9k" (OuterVolumeSpecName: "kube-api-access-2ww9k") pod "2166cf23-2a65-4b17-922e-3131be1d6d8b" (UID: "2166cf23-2a65-4b17-922e-3131be1d6d8b"). InnerVolumeSpecName "kube-api-access-2ww9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:50:04 crc kubenswrapper[4733]: I0318 10:50:04.371124 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ww9k\" (UniqueName: \"kubernetes.io/projected/2166cf23-2a65-4b17-922e-3131be1d6d8b-kube-api-access-2ww9k\") on node \"crc\" DevicePath \"\"" Mar 18 10:50:04 crc kubenswrapper[4733]: I0318 10:50:04.734327 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563850-8gwpw" event={"ID":"2166cf23-2a65-4b17-922e-3131be1d6d8b","Type":"ContainerDied","Data":"46ecf4faa11e9d76da828b425e3870304d56ac67fef7dbfbe7f42a0695dc20ec"} Mar 18 10:50:04 crc kubenswrapper[4733]: I0318 10:50:04.734387 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46ecf4faa11e9d76da828b425e3870304d56ac67fef7dbfbe7f42a0695dc20ec" Mar 18 10:50:04 crc kubenswrapper[4733]: I0318 10:50:04.734399 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563850-8gwpw" Mar 18 10:50:05 crc kubenswrapper[4733]: I0318 10:50:05.219047 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-cdw7c"] Mar 18 10:50:05 crc kubenswrapper[4733]: I0318 10:50:05.230677 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563844-cdw7c"] Mar 18 10:50:07 crc kubenswrapper[4733]: I0318 10:50:07.193947 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a764647f-75c8-4ce3-82fb-b2b729a895a0" path="/var/lib/kubelet/pods/a764647f-75c8-4ce3-82fb-b2b729a895a0/volumes" Mar 18 10:50:08 crc kubenswrapper[4733]: I0318 10:50:08.087878 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:50:08 crc kubenswrapper[4733]: I0318 10:50:08.174350 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:50:08 crc kubenswrapper[4733]: I0318 10:50:08.339467 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2nkxk"] Mar 18 10:50:09 crc kubenswrapper[4733]: I0318 10:50:09.787849 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2nkxk" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="registry-server" containerID="cri-o://5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262" gracePeriod=2 Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.389984 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.502068 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h8vm\" (UniqueName: \"kubernetes.io/projected/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-kube-api-access-5h8vm\") pod \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.502218 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-catalog-content\") pod \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.502385 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-utilities\") pod \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\" (UID: \"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2\") " Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.504110 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-utilities" (OuterVolumeSpecName: "utilities") pod "c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" (UID: "c9c6e055-5d64-4158-b0d2-27de0cb9f7c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.508845 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-kube-api-access-5h8vm" (OuterVolumeSpecName: "kube-api-access-5h8vm") pod "c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" (UID: "c9c6e055-5d64-4158-b0d2-27de0cb9f7c2"). InnerVolumeSpecName "kube-api-access-5h8vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.604476 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.604523 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h8vm\" (UniqueName: \"kubernetes.io/projected/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-kube-api-access-5h8vm\") on node \"crc\" DevicePath \"\"" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.674340 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" (UID: "c9c6e055-5d64-4158-b0d2-27de0cb9f7c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.706855 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.720951 4733 scope.go:117] "RemoveContainer" containerID="4e31a574451b3bae3c66b0663fefb42d9a6b941c8dd7f0cf1a6c603f449c0e3b" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.799365 4733 generic.go:334] "Generic (PLEG): container finished" podID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerID="5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262" exitCode=0 Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.799416 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nkxk" event={"ID":"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2","Type":"ContainerDied","Data":"5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262"} Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.799453 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nkxk" event={"ID":"c9c6e055-5d64-4158-b0d2-27de0cb9f7c2","Type":"ContainerDied","Data":"ecd120dd90e1036bb5e6d554a5806a7bd302457439032a59781bedfcd41b555a"} Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.799458 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2nkxk" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.799471 4733 scope.go:117] "RemoveContainer" containerID="5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.827059 4733 scope.go:117] "RemoveContainer" containerID="c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.843293 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2nkxk"] Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.850765 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2nkxk"] Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.866848 4733 scope.go:117] "RemoveContainer" containerID="c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.889370 4733 scope.go:117] "RemoveContainer" containerID="5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262" Mar 18 10:50:10 crc kubenswrapper[4733]: E0318 10:50:10.889782 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262\": container with ID starting with 5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262 not found: ID does not exist" containerID="5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.889843 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262"} err="failed to get container status \"5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262\": rpc error: code = NotFound desc = could not find container 
\"5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262\": container with ID starting with 5a8d6935c3c6f86eaf07e3b76ff0680d7842eea801abcd908667011593413262 not found: ID does not exist" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.889877 4733 scope.go:117] "RemoveContainer" containerID="c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7" Mar 18 10:50:10 crc kubenswrapper[4733]: E0318 10:50:10.890293 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7\": container with ID starting with c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7 not found: ID does not exist" containerID="c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.890336 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7"} err="failed to get container status \"c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7\": rpc error: code = NotFound desc = could not find container \"c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7\": container with ID starting with c30c08fe95cea2bf662162f1cab4d1b78eb0e6f559bd1de70cda2437fd791ae7 not found: ID does not exist" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.890367 4733 scope.go:117] "RemoveContainer" containerID="c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42" Mar 18 10:50:10 crc kubenswrapper[4733]: E0318 10:50:10.890699 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42\": container with ID starting with c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42 not found: ID does not exist" 
containerID="c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42" Mar 18 10:50:10 crc kubenswrapper[4733]: I0318 10:50:10.890733 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42"} err="failed to get container status \"c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42\": rpc error: code = NotFound desc = could not find container \"c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42\": container with ID starting with c629fb05ebbbd1dadfdfa794e8f50fea898971c063ea18eb304ec5d519dceb42 not found: ID does not exist" Mar 18 10:50:11 crc kubenswrapper[4733]: I0318 10:50:11.191584 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" path="/var/lib/kubelet/pods/c9c6e055-5d64-4158-b0d2-27de0cb9f7c2/volumes" Mar 18 10:50:13 crc kubenswrapper[4733]: I0318 10:50:13.571138 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:50:13 crc kubenswrapper[4733]: I0318 10:50:13.571554 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:50:15 crc kubenswrapper[4733]: I0318 10:50:15.176481 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:50:15 crc kubenswrapper[4733]: E0318 10:50:15.177185 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:50:17 crc kubenswrapper[4733]: I0318 10:50:17.175754 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:50:17 crc kubenswrapper[4733]: E0318 10:50:17.176167 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:50:28 crc kubenswrapper[4733]: I0318 10:50:28.175406 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:50:28 crc kubenswrapper[4733]: I0318 10:50:28.176000 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:50:28 crc kubenswrapper[4733]: E0318 10:50:28.176262 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:50:28 crc kubenswrapper[4733]: E0318 10:50:28.176491 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" 
podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:50:40 crc kubenswrapper[4733]: I0318 10:50:40.175827 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:50:40 crc kubenswrapper[4733]: E0318 10:50:40.177813 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:50:42 crc kubenswrapper[4733]: I0318 10:50:42.175101 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:50:42 crc kubenswrapper[4733]: E0318 10:50:42.175410 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:50:43 crc kubenswrapper[4733]: I0318 10:50:43.571034 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:50:43 crc kubenswrapper[4733]: I0318 10:50:43.571562 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:50:52 crc 
kubenswrapper[4733]: I0318 10:50:52.175966 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:50:52 crc kubenswrapper[4733]: E0318 10:50:52.177176 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:50:53 crc kubenswrapper[4733]: I0318 10:50:53.176730 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:50:53 crc kubenswrapper[4733]: E0318 10:50:53.177385 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:51:04 crc kubenswrapper[4733]: I0318 10:51:04.175969 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:51:04 crc kubenswrapper[4733]: I0318 10:51:04.176622 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:51:04 crc kubenswrapper[4733]: E0318 10:51:04.176820 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:51:04 crc kubenswrapper[4733]: E0318 10:51:04.177072 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:51:13 crc kubenswrapper[4733]: I0318 10:51:13.570817 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:51:13 crc kubenswrapper[4733]: I0318 10:51:13.571437 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:51:13 crc kubenswrapper[4733]: I0318 10:51:13.571488 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" Mar 18 10:51:13 crc kubenswrapper[4733]: I0318 10:51:13.572228 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 10:51:13 crc kubenswrapper[4733]: I0318 10:51:13.572304 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" 
containerName="machine-config-daemon" containerID="cri-o://13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" gracePeriod=600 Mar 18 10:51:13 crc kubenswrapper[4733]: E0318 10:51:13.718094 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:51:14 crc kubenswrapper[4733]: I0318 10:51:14.196162 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" exitCode=0 Mar 18 10:51:14 crc kubenswrapper[4733]: I0318 10:51:14.196241 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48"} Mar 18 10:51:14 crc kubenswrapper[4733]: I0318 10:51:14.196596 4733 scope.go:117] "RemoveContainer" containerID="4aabe714853c502719880f7f27bb562465a6a84fdec9e321e389ec23753f6337" Mar 18 10:51:14 crc kubenswrapper[4733]: I0318 10:51:14.197263 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:51:14 crc kubenswrapper[4733]: E0318 10:51:14.198489 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:51:15 crc kubenswrapper[4733]: I0318 10:51:15.176925 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:51:15 crc kubenswrapper[4733]: E0318 10:51:15.177149 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:51:16 crc kubenswrapper[4733]: I0318 10:51:16.176574 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:51:16 crc kubenswrapper[4733]: E0318 10:51:16.177533 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:51:18 crc kubenswrapper[4733]: E0318 10:51:18.206961 4733 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.032s" Mar 18 10:51:26 crc kubenswrapper[4733]: I0318 10:51:26.176084 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:51:26 crc kubenswrapper[4733]: E0318 10:51:26.177029 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:51:28 crc kubenswrapper[4733]: I0318 10:51:28.175958 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:51:28 crc kubenswrapper[4733]: E0318 10:51:28.176627 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:51:29 crc kubenswrapper[4733]: I0318 10:51:29.176157 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:51:29 crc kubenswrapper[4733]: E0318 10:51:29.176874 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:51:38 crc kubenswrapper[4733]: I0318 10:51:38.176995 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:51:38 crc kubenswrapper[4733]: E0318 10:51:38.178269 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:51:39 crc kubenswrapper[4733]: I0318 10:51:39.176340 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:51:39 crc kubenswrapper[4733]: E0318 10:51:39.177531 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:51:42 crc kubenswrapper[4733]: I0318 10:51:42.175723 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:51:42 crc kubenswrapper[4733]: E0318 10:51:42.176506 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:51:50 crc kubenswrapper[4733]: I0318 10:51:50.176158 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:51:50 crc kubenswrapper[4733]: E0318 10:51:50.177325 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:51:54 crc kubenswrapper[4733]: 
I0318 10:51:54.175895 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:51:54 crc kubenswrapper[4733]: E0318 10:51:54.176881 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:51:56 crc kubenswrapper[4733]: I0318 10:51:56.176672 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:51:56 crc kubenswrapper[4733]: E0318 10:51:56.177386 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.158322 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563852-nxds9"] Mar 18 10:52:00 crc kubenswrapper[4733]: E0318 10:52:00.159608 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerName="registry-server" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.159632 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerName="registry-server" Mar 18 10:52:00 crc kubenswrapper[4733]: E0318 10:52:00.159659 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2166cf23-2a65-4b17-922e-3131be1d6d8b" containerName="oc" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.159670 4733 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2166cf23-2a65-4b17-922e-3131be1d6d8b" containerName="oc" Mar 18 10:52:00 crc kubenswrapper[4733]: E0318 10:52:00.159695 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="registry-server" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.159708 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="registry-server" Mar 18 10:52:00 crc kubenswrapper[4733]: E0318 10:52:00.159724 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="extract-content" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.159734 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="extract-content" Mar 18 10:52:00 crc kubenswrapper[4733]: E0318 10:52:00.159757 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerName="extract-content" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.159768 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerName="extract-content" Mar 18 10:52:00 crc kubenswrapper[4733]: E0318 10:52:00.159796 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="extract-utilities" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.159808 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="extract-utilities" Mar 18 10:52:00 crc kubenswrapper[4733]: E0318 10:52:00.159823 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerName="extract-utilities" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.159832 4733 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerName="extract-utilities" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.160103 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49b68f2-ffd0-4ff7-a4ae-691c0a71e896" containerName="registry-server" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.160134 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c6e055-5d64-4158-b0d2-27de0cb9f7c2" containerName="registry-server" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.160155 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="2166cf23-2a65-4b17-922e-3131be1d6d8b" containerName="oc" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.160913 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-nxds9" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.164219 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-nxds9"] Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.165008 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.165901 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.166489 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.287672 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq8r5\" (UniqueName: \"kubernetes.io/projected/21264fda-07b1-4a7f-ac61-432c6dc9a230-kube-api-access-nq8r5\") pod \"auto-csr-approver-29563852-nxds9\" (UID: \"21264fda-07b1-4a7f-ac61-432c6dc9a230\") " 
pod="openshift-infra/auto-csr-approver-29563852-nxds9" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.389743 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq8r5\" (UniqueName: \"kubernetes.io/projected/21264fda-07b1-4a7f-ac61-432c6dc9a230-kube-api-access-nq8r5\") pod \"auto-csr-approver-29563852-nxds9\" (UID: \"21264fda-07b1-4a7f-ac61-432c6dc9a230\") " pod="openshift-infra/auto-csr-approver-29563852-nxds9" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.422632 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq8r5\" (UniqueName: \"kubernetes.io/projected/21264fda-07b1-4a7f-ac61-432c6dc9a230-kube-api-access-nq8r5\") pod \"auto-csr-approver-29563852-nxds9\" (UID: \"21264fda-07b1-4a7f-ac61-432c6dc9a230\") " pod="openshift-infra/auto-csr-approver-29563852-nxds9" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.486582 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-nxds9" Mar 18 10:52:00 crc kubenswrapper[4733]: I0318 10:52:00.973021 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-nxds9"] Mar 18 10:52:00 crc kubenswrapper[4733]: W0318 10:52:00.990067 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21264fda_07b1_4a7f_ac61_432c6dc9a230.slice/crio-ff53466daa2cf7b98be28892b619a1f0a9aab3cc5dc1eb983f44b8effab0bf15 WatchSource:0}: Error finding container ff53466daa2cf7b98be28892b619a1f0a9aab3cc5dc1eb983f44b8effab0bf15: Status 404 returned error can't find the container with id ff53466daa2cf7b98be28892b619a1f0a9aab3cc5dc1eb983f44b8effab0bf15 Mar 18 10:52:01 crc kubenswrapper[4733]: I0318 10:52:01.683601 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-nxds9" 
event={"ID":"21264fda-07b1-4a7f-ac61-432c6dc9a230","Type":"ContainerStarted","Data":"ff53466daa2cf7b98be28892b619a1f0a9aab3cc5dc1eb983f44b8effab0bf15"} Mar 18 10:52:02 crc kubenswrapper[4733]: I0318 10:52:02.696287 4733 generic.go:334] "Generic (PLEG): container finished" podID="21264fda-07b1-4a7f-ac61-432c6dc9a230" containerID="6234fbb28241739a6b36f7e66aab35dc25489c2ddedad91a0ad07ea33e77be17" exitCode=0 Mar 18 10:52:02 crc kubenswrapper[4733]: I0318 10:52:02.696415 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-nxds9" event={"ID":"21264fda-07b1-4a7f-ac61-432c6dc9a230","Type":"ContainerDied","Data":"6234fbb28241739a6b36f7e66aab35dc25489c2ddedad91a0ad07ea33e77be17"} Mar 18 10:52:03 crc kubenswrapper[4733]: I0318 10:52:03.179249 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:52:03 crc kubenswrapper[4733]: E0318 10:52:03.180128 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:52:04 crc kubenswrapper[4733]: I0318 10:52:04.092353 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-nxds9" Mar 18 10:52:04 crc kubenswrapper[4733]: I0318 10:52:04.158775 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq8r5\" (UniqueName: \"kubernetes.io/projected/21264fda-07b1-4a7f-ac61-432c6dc9a230-kube-api-access-nq8r5\") pod \"21264fda-07b1-4a7f-ac61-432c6dc9a230\" (UID: \"21264fda-07b1-4a7f-ac61-432c6dc9a230\") " Mar 18 10:52:04 crc kubenswrapper[4733]: I0318 10:52:04.164988 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21264fda-07b1-4a7f-ac61-432c6dc9a230-kube-api-access-nq8r5" (OuterVolumeSpecName: "kube-api-access-nq8r5") pod "21264fda-07b1-4a7f-ac61-432c6dc9a230" (UID: "21264fda-07b1-4a7f-ac61-432c6dc9a230"). InnerVolumeSpecName "kube-api-access-nq8r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:52:04 crc kubenswrapper[4733]: I0318 10:52:04.261239 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq8r5\" (UniqueName: \"kubernetes.io/projected/21264fda-07b1-4a7f-ac61-432c6dc9a230-kube-api-access-nq8r5\") on node \"crc\" DevicePath \"\"" Mar 18 10:52:04 crc kubenswrapper[4733]: I0318 10:52:04.715990 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563852-nxds9" event={"ID":"21264fda-07b1-4a7f-ac61-432c6dc9a230","Type":"ContainerDied","Data":"ff53466daa2cf7b98be28892b619a1f0a9aab3cc5dc1eb983f44b8effab0bf15"} Mar 18 10:52:04 crc kubenswrapper[4733]: I0318 10:52:04.716045 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff53466daa2cf7b98be28892b619a1f0a9aab3cc5dc1eb983f44b8effab0bf15" Mar 18 10:52:04 crc kubenswrapper[4733]: I0318 10:52:04.716063 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563852-nxds9" Mar 18 10:52:05 crc kubenswrapper[4733]: I0318 10:52:05.177353 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:52:05 crc kubenswrapper[4733]: E0318 10:52:05.177830 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:52:05 crc kubenswrapper[4733]: I0318 10:52:05.217476 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-zfzc4"] Mar 18 10:52:05 crc kubenswrapper[4733]: I0318 10:52:05.217549 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563846-zfzc4"] Mar 18 10:52:07 crc kubenswrapper[4733]: I0318 10:52:07.191673 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d9650e-918c-4ff4-82bd-ba01e08b6588" path="/var/lib/kubelet/pods/02d9650e-918c-4ff4-82bd-ba01e08b6588/volumes" Mar 18 10:52:10 crc kubenswrapper[4733]: I0318 10:52:10.847132 4733 scope.go:117] "RemoveContainer" containerID="df6a0cb730f6fbfc1289efd16f5838442dcf854d748f1b313f239fe3a8ed31b9" Mar 18 10:52:11 crc kubenswrapper[4733]: I0318 10:52:11.185240 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:52:11 crc kubenswrapper[4733]: E0318 10:52:11.185681 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:52:15 crc kubenswrapper[4733]: I0318 10:52:15.175827 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:52:15 crc kubenswrapper[4733]: E0318 10:52:15.176704 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:52:16 crc kubenswrapper[4733]: I0318 10:52:16.176642 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:52:16 crc kubenswrapper[4733]: E0318 10:52:16.176939 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:52:22 crc kubenswrapper[4733]: I0318 10:52:22.176543 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:52:22 crc kubenswrapper[4733]: E0318 10:52:22.177278 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:52:27 crc kubenswrapper[4733]: I0318 10:52:27.175366 4733 scope.go:117] "RemoveContainer" 
containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:52:27 crc kubenswrapper[4733]: E0318 10:52:27.176052 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:52:29 crc kubenswrapper[4733]: I0318 10:52:29.176270 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:52:29 crc kubenswrapper[4733]: E0318 10:52:29.180705 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:52:35 crc kubenswrapper[4733]: I0318 10:52:35.175838 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:52:35 crc kubenswrapper[4733]: E0318 10:52:35.176828 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:52:40 crc kubenswrapper[4733]: I0318 10:52:40.176257 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:52:40 crc kubenswrapper[4733]: E0318 10:52:40.177393 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:52:40 crc kubenswrapper[4733]: I0318 10:52:40.177393 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:52:40 crc kubenswrapper[4733]: E0318 10:52:40.178246 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:52:46 crc kubenswrapper[4733]: I0318 10:52:46.176473 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:52:46 crc kubenswrapper[4733]: E0318 10:52:46.177305 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.089869 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2vhhc"] Mar 18 10:52:48 crc kubenswrapper[4733]: E0318 10:52:48.090382 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21264fda-07b1-4a7f-ac61-432c6dc9a230" containerName="oc" Mar 18 10:52:48 crc kubenswrapper[4733]: 
I0318 10:52:48.090405 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="21264fda-07b1-4a7f-ac61-432c6dc9a230" containerName="oc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.090732 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="21264fda-07b1-4a7f-ac61-432c6dc9a230" containerName="oc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.092669 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.124781 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2vhhc"] Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.198483 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46zw\" (UniqueName: \"kubernetes.io/projected/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-kube-api-access-r46zw\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.198619 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-utilities\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.198705 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-catalog-content\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 
10:52:48.300109 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-utilities\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.300359 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-catalog-content\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.300446 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r46zw\" (UniqueName: \"kubernetes.io/projected/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-kube-api-access-r46zw\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.300699 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-utilities\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.300881 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-catalog-content\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.323282 4733 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r46zw\" (UniqueName: \"kubernetes.io/projected/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-kube-api-access-r46zw\") pod \"certified-operators-2vhhc\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.439549 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:48 crc kubenswrapper[4733]: I0318 10:52:48.718013 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2vhhc"] Mar 18 10:52:49 crc kubenswrapper[4733]: I0318 10:52:49.140963 4733 generic.go:334] "Generic (PLEG): container finished" podID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerID="0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d" exitCode=0 Mar 18 10:52:49 crc kubenswrapper[4733]: I0318 10:52:49.141077 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vhhc" event={"ID":"c884f6b7-6551-4a77-b19f-d0ea8c634eb2","Type":"ContainerDied","Data":"0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d"} Mar 18 10:52:49 crc kubenswrapper[4733]: I0318 10:52:49.141548 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vhhc" event={"ID":"c884f6b7-6551-4a77-b19f-d0ea8c634eb2","Type":"ContainerStarted","Data":"bfb7ae10f9639c8b35d73989ab0d314c8fd91b3fa87651193e323db772fbad71"} Mar 18 10:52:50 crc kubenswrapper[4733]: I0318 10:52:50.150166 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vhhc" event={"ID":"c884f6b7-6551-4a77-b19f-d0ea8c634eb2","Type":"ContainerStarted","Data":"4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50"} Mar 18 10:52:51 crc kubenswrapper[4733]: I0318 10:52:51.164113 4733 
generic.go:334] "Generic (PLEG): container finished" podID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerID="4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50" exitCode=0 Mar 18 10:52:51 crc kubenswrapper[4733]: I0318 10:52:51.164178 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vhhc" event={"ID":"c884f6b7-6551-4a77-b19f-d0ea8c634eb2","Type":"ContainerDied","Data":"4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50"} Mar 18 10:52:52 crc kubenswrapper[4733]: I0318 10:52:52.176103 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:52:52 crc kubenswrapper[4733]: E0318 10:52:52.176942 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:52:52 crc kubenswrapper[4733]: I0318 10:52:52.179616 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vhhc" event={"ID":"c884f6b7-6551-4a77-b19f-d0ea8c634eb2","Type":"ContainerStarted","Data":"f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738"} Mar 18 10:52:52 crc kubenswrapper[4733]: I0318 10:52:52.204782 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2vhhc" podStartSLOduration=1.801296522 podStartE2EDuration="4.204763189s" podCreationTimestamp="2026-03-18 10:52:48 +0000 UTC" firstStartedPulling="2026-03-18 10:52:49.143559419 +0000 UTC m=+2408.635293774" lastFinishedPulling="2026-03-18 10:52:51.547026116 +0000 UTC m=+2411.038760441" 
observedRunningTime="2026-03-18 10:52:52.199883931 +0000 UTC m=+2411.691618256" watchObservedRunningTime="2026-03-18 10:52:52.204763189 +0000 UTC m=+2411.696497514" Mar 18 10:52:53 crc kubenswrapper[4733]: I0318 10:52:53.176417 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:52:53 crc kubenswrapper[4733]: E0318 10:52:53.177042 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:52:58 crc kubenswrapper[4733]: I0318 10:52:58.175786 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:52:58 crc kubenswrapper[4733]: E0318 10:52:58.177180 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:52:58 crc kubenswrapper[4733]: I0318 10:52:58.439776 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:58 crc kubenswrapper[4733]: I0318 10:52:58.439857 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:58 crc kubenswrapper[4733]: I0318 10:52:58.504563 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:59 crc kubenswrapper[4733]: I0318 10:52:59.324037 4733 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:52:59 crc kubenswrapper[4733]: I0318 10:52:59.383507 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2vhhc"] Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.269077 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2vhhc" podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerName="registry-server" containerID="cri-o://f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738" gracePeriod=2 Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.764157 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.834262 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-catalog-content\") pod \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.834445 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-utilities\") pod \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.834545 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r46zw\" (UniqueName: \"kubernetes.io/projected/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-kube-api-access-r46zw\") pod \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\" (UID: \"c884f6b7-6551-4a77-b19f-d0ea8c634eb2\") " Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.835519 4733 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-utilities" (OuterVolumeSpecName: "utilities") pod "c884f6b7-6551-4a77-b19f-d0ea8c634eb2" (UID: "c884f6b7-6551-4a77-b19f-d0ea8c634eb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.843697 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-kube-api-access-r46zw" (OuterVolumeSpecName: "kube-api-access-r46zw") pod "c884f6b7-6551-4a77-b19f-d0ea8c634eb2" (UID: "c884f6b7-6551-4a77-b19f-d0ea8c634eb2"). InnerVolumeSpecName "kube-api-access-r46zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.910375 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c884f6b7-6551-4a77-b19f-d0ea8c634eb2" (UID: "c884f6b7-6551-4a77-b19f-d0ea8c634eb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.936057 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r46zw\" (UniqueName: \"kubernetes.io/projected/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-kube-api-access-r46zw\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.936099 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:01 crc kubenswrapper[4733]: I0318 10:53:01.936110 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c884f6b7-6551-4a77-b19f-d0ea8c634eb2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.280642 4733 generic.go:334] "Generic (PLEG): container finished" podID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerID="f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738" exitCode=0 Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.280720 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2vhhc" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.280701 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vhhc" event={"ID":"c884f6b7-6551-4a77-b19f-d0ea8c634eb2","Type":"ContainerDied","Data":"f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738"} Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.281231 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vhhc" event={"ID":"c884f6b7-6551-4a77-b19f-d0ea8c634eb2","Type":"ContainerDied","Data":"bfb7ae10f9639c8b35d73989ab0d314c8fd91b3fa87651193e323db772fbad71"} Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.281269 4733 scope.go:117] "RemoveContainer" containerID="f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.322220 4733 scope.go:117] "RemoveContainer" containerID="4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.329867 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2vhhc"] Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.349673 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2vhhc"] Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.360266 4733 scope.go:117] "RemoveContainer" containerID="0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.385667 4733 scope.go:117] "RemoveContainer" containerID="f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738" Mar 18 10:53:02 crc kubenswrapper[4733]: E0318 10:53:02.386075 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738\": container with ID starting with f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738 not found: ID does not exist" containerID="f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.386110 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738"} err="failed to get container status \"f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738\": rpc error: code = NotFound desc = could not find container \"f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738\": container with ID starting with f289f6deea9d24ec4de7af790bd1a748a15b66a6de7ff132e8b3a144800ff738 not found: ID does not exist" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.386135 4733 scope.go:117] "RemoveContainer" containerID="4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50" Mar 18 10:53:02 crc kubenswrapper[4733]: E0318 10:53:02.386565 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50\": container with ID starting with 4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50 not found: ID does not exist" containerID="4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.386589 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50"} err="failed to get container status \"4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50\": rpc error: code = NotFound desc = could not find container \"4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50\": container with ID 
starting with 4f56f4c122771f35159f24bcb1a971cd9aaf2bdd30567d91aef109eb055c2b50 not found: ID does not exist" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.386607 4733 scope.go:117] "RemoveContainer" containerID="0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d" Mar 18 10:53:02 crc kubenswrapper[4733]: E0318 10:53:02.386841 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d\": container with ID starting with 0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d not found: ID does not exist" containerID="0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d" Mar 18 10:53:02 crc kubenswrapper[4733]: I0318 10:53:02.386869 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d"} err="failed to get container status \"0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d\": rpc error: code = NotFound desc = could not find container \"0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d\": container with ID starting with 0a9f0b6652df7606f5074c262197297d4120c34ea16c27776883a78daecb640d not found: ID does not exist" Mar 18 10:53:03 crc kubenswrapper[4733]: I0318 10:53:03.197808 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" path="/var/lib/kubelet/pods/c884f6b7-6551-4a77-b19f-d0ea8c634eb2/volumes" Mar 18 10:53:04 crc kubenswrapper[4733]: I0318 10:53:04.176007 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:53:04 crc kubenswrapper[4733]: E0318 10:53:04.176534 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:53:07 crc kubenswrapper[4733]: I0318 10:53:07.175463 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:53:07 crc kubenswrapper[4733]: E0318 10:53:07.176461 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:53:09 crc kubenswrapper[4733]: I0318 10:53:09.176561 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:53:09 crc kubenswrapper[4733]: E0318 10:53:09.177393 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:53:18 crc kubenswrapper[4733]: I0318 10:53:18.176200 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:53:18 crc kubenswrapper[4733]: I0318 10:53:18.177141 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:53:18 crc kubenswrapper[4733]: E0318 10:53:18.177433 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:53:18 crc kubenswrapper[4733]: E0318 10:53:18.177669 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:53:23 crc kubenswrapper[4733]: I0318 10:53:23.176111 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:53:23 crc kubenswrapper[4733]: E0318 10:53:23.177277 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:53:29 crc kubenswrapper[4733]: I0318 10:53:29.179286 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:53:29 crc kubenswrapper[4733]: E0318 10:53:29.181585 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:53:31 crc 
kubenswrapper[4733]: I0318 10:53:31.181657 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:53:31 crc kubenswrapper[4733]: I0318 10:53:31.577765 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187"} Mar 18 10:53:31 crc kubenswrapper[4733]: I0318 10:53:31.578876 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 10:53:34 crc kubenswrapper[4733]: I0318 10:53:34.175766 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:53:34 crc kubenswrapper[4733]: E0318 10:53:34.176513 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:53:36 crc kubenswrapper[4733]: I0318 10:53:36.633272 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" exitCode=0 Mar 18 10:53:36 crc kubenswrapper[4733]: I0318 10:53:36.633332 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187"} Mar 18 10:53:36 crc kubenswrapper[4733]: I0318 10:53:36.634558 4733 scope.go:117] "RemoveContainer" containerID="52c377e9a60c9ca96c08e610b060d46b5fcfe8f4ca8351f71d96116255ccee60" Mar 18 10:53:36 crc kubenswrapper[4733]: 
I0318 10:53:36.635628 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:53:36 crc kubenswrapper[4733]: E0318 10:53:36.636042 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:53:44 crc kubenswrapper[4733]: I0318 10:53:44.175063 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:53:44 crc kubenswrapper[4733]: E0318 10:53:44.177258 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:53:45 crc kubenswrapper[4733]: I0318 10:53:45.176498 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:53:45 crc kubenswrapper[4733]: I0318 10:53:45.727655 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c"} Mar 18 10:53:45 crc kubenswrapper[4733]: I0318 10:53:45.728177 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 10:53:49 crc kubenswrapper[4733]: I0318 10:53:49.783158 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" exitCode=0 Mar 18 10:53:49 crc kubenswrapper[4733]: I0318 10:53:49.783330 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c"} Mar 18 10:53:49 crc kubenswrapper[4733]: I0318 10:53:49.784075 4733 scope.go:117] "RemoveContainer" containerID="615d6075a16d1723238d5f484c97fecdaa694488b8494a55c0a3d329cf030b8f" Mar 18 10:53:49 crc kubenswrapper[4733]: I0318 10:53:49.785513 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:53:49 crc kubenswrapper[4733]: E0318 10:53:49.786622 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:53:50 crc kubenswrapper[4733]: I0318 10:53:50.176177 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:53:50 crc kubenswrapper[4733]: E0318 10:53:50.176624 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:53:59 crc kubenswrapper[4733]: I0318 10:53:59.176267 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:53:59 crc 
kubenswrapper[4733]: E0318 10:53:59.177170 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.167053 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563854-8xm7m"] Mar 18 10:54:00 crc kubenswrapper[4733]: E0318 10:54:00.167763 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerName="extract-utilities" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.167808 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerName="extract-utilities" Mar 18 10:54:00 crc kubenswrapper[4733]: E0318 10:54:00.167876 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerName="registry-server" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.167896 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerName="registry-server" Mar 18 10:54:00 crc kubenswrapper[4733]: E0318 10:54:00.167937 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerName="extract-content" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.167955 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerName="extract-content" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.168360 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c884f6b7-6551-4a77-b19f-d0ea8c634eb2" containerName="registry-server" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.169295 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-8xm7m" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.174417 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.174434 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.177244 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.190026 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-8xm7m"] Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.246081 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zl8r\" (UniqueName: \"kubernetes.io/projected/e2194f8e-8219-4d20-9657-bab035e9ce0b-kube-api-access-5zl8r\") pod \"auto-csr-approver-29563854-8xm7m\" (UID: \"e2194f8e-8219-4d20-9657-bab035e9ce0b\") " pod="openshift-infra/auto-csr-approver-29563854-8xm7m" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.347607 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zl8r\" (UniqueName: \"kubernetes.io/projected/e2194f8e-8219-4d20-9657-bab035e9ce0b-kube-api-access-5zl8r\") pod \"auto-csr-approver-29563854-8xm7m\" (UID: \"e2194f8e-8219-4d20-9657-bab035e9ce0b\") " pod="openshift-infra/auto-csr-approver-29563854-8xm7m" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.386633 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zl8r\" 
(UniqueName: \"kubernetes.io/projected/e2194f8e-8219-4d20-9657-bab035e9ce0b-kube-api-access-5zl8r\") pod \"auto-csr-approver-29563854-8xm7m\" (UID: \"e2194f8e-8219-4d20-9657-bab035e9ce0b\") " pod="openshift-infra/auto-csr-approver-29563854-8xm7m" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.498402 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-8xm7m" Mar 18 10:54:00 crc kubenswrapper[4733]: I0318 10:54:00.981816 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-8xm7m"] Mar 18 10:54:00 crc kubenswrapper[4733]: W0318 10:54:00.982988 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2194f8e_8219_4d20_9657_bab035e9ce0b.slice/crio-b460879da7c6f247784fa731b833b199cebd1e8920e884c338530badccb40f7d WatchSource:0}: Error finding container b460879da7c6f247784fa731b833b199cebd1e8920e884c338530badccb40f7d: Status 404 returned error can't find the container with id b460879da7c6f247784fa731b833b199cebd1e8920e884c338530badccb40f7d Mar 18 10:54:01 crc kubenswrapper[4733]: I0318 10:54:01.948557 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-8xm7m" event={"ID":"e2194f8e-8219-4d20-9657-bab035e9ce0b","Type":"ContainerStarted","Data":"b460879da7c6f247784fa731b833b199cebd1e8920e884c338530badccb40f7d"} Mar 18 10:54:02 crc kubenswrapper[4733]: I0318 10:54:02.959712 4733 generic.go:334] "Generic (PLEG): container finished" podID="e2194f8e-8219-4d20-9657-bab035e9ce0b" containerID="0646c2eb1d4076069ba17429b100767c9ea92208b7525c26c0789773916b849f" exitCode=0 Mar 18 10:54:02 crc kubenswrapper[4733]: I0318 10:54:02.959815 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-8xm7m" 
event={"ID":"e2194f8e-8219-4d20-9657-bab035e9ce0b","Type":"ContainerDied","Data":"0646c2eb1d4076069ba17429b100767c9ea92208b7525c26c0789773916b849f"} Mar 18 10:54:03 crc kubenswrapper[4733]: I0318 10:54:03.176134 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:54:03 crc kubenswrapper[4733]: E0318 10:54:03.177050 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:54:04 crc kubenswrapper[4733]: I0318 10:54:04.313230 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-8xm7m" Mar 18 10:54:04 crc kubenswrapper[4733]: I0318 10:54:04.335248 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zl8r\" (UniqueName: \"kubernetes.io/projected/e2194f8e-8219-4d20-9657-bab035e9ce0b-kube-api-access-5zl8r\") pod \"e2194f8e-8219-4d20-9657-bab035e9ce0b\" (UID: \"e2194f8e-8219-4d20-9657-bab035e9ce0b\") " Mar 18 10:54:04 crc kubenswrapper[4733]: I0318 10:54:04.346582 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2194f8e-8219-4d20-9657-bab035e9ce0b-kube-api-access-5zl8r" (OuterVolumeSpecName: "kube-api-access-5zl8r") pod "e2194f8e-8219-4d20-9657-bab035e9ce0b" (UID: "e2194f8e-8219-4d20-9657-bab035e9ce0b"). InnerVolumeSpecName "kube-api-access-5zl8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:54:04 crc kubenswrapper[4733]: I0318 10:54:04.437077 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zl8r\" (UniqueName: \"kubernetes.io/projected/e2194f8e-8219-4d20-9657-bab035e9ce0b-kube-api-access-5zl8r\") on node \"crc\" DevicePath \"\"" Mar 18 10:54:04 crc kubenswrapper[4733]: I0318 10:54:04.979346 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563854-8xm7m" event={"ID":"e2194f8e-8219-4d20-9657-bab035e9ce0b","Type":"ContainerDied","Data":"b460879da7c6f247784fa731b833b199cebd1e8920e884c338530badccb40f7d"} Mar 18 10:54:04 crc kubenswrapper[4733]: I0318 10:54:04.979387 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b460879da7c6f247784fa731b833b199cebd1e8920e884c338530badccb40f7d" Mar 18 10:54:04 crc kubenswrapper[4733]: I0318 10:54:04.979425 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563854-8xm7m" Mar 18 10:54:05 crc kubenswrapper[4733]: I0318 10:54:05.175602 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:54:05 crc kubenswrapper[4733]: E0318 10:54:05.175821 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:54:05 crc kubenswrapper[4733]: I0318 10:54:05.399558 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563848-2tfjq"] Mar 18 10:54:05 crc kubenswrapper[4733]: I0318 10:54:05.405700 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29563848-2tfjq"] Mar 18 10:54:07 crc kubenswrapper[4733]: I0318 10:54:07.186311 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc1a1de-2965-422f-8ac4-77358d0d7df6" path="/var/lib/kubelet/pods/1bc1a1de-2965-422f-8ac4-77358d0d7df6/volumes" Mar 18 10:54:10 crc kubenswrapper[4733]: I0318 10:54:10.987870 4733 scope.go:117] "RemoveContainer" containerID="6fe3d8a40e1ba17924153a168a35cf8fd5e9cc3d1fdeb9b0f70b81b8350f5f56" Mar 18 10:54:14 crc kubenswrapper[4733]: I0318 10:54:14.175853 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:54:14 crc kubenswrapper[4733]: E0318 10:54:14.176860 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:54:14 crc kubenswrapper[4733]: I0318 10:54:14.177872 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:54:14 crc kubenswrapper[4733]: E0318 10:54:14.178243 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:54:17 crc kubenswrapper[4733]: I0318 10:54:17.175380 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:54:17 crc kubenswrapper[4733]: E0318 10:54:17.177233 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:54:25 crc kubenswrapper[4733]: I0318 10:54:25.175420 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:54:25 crc kubenswrapper[4733]: E0318 10:54:25.176611 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:54:27 crc kubenswrapper[4733]: I0318 10:54:27.175987 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:54:27 crc kubenswrapper[4733]: E0318 10:54:27.176425 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:54:32 crc kubenswrapper[4733]: I0318 10:54:32.176571 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:54:32 crc kubenswrapper[4733]: E0318 10:54:32.177566 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:54:37 crc kubenswrapper[4733]: I0318 10:54:37.176092 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:54:37 crc kubenswrapper[4733]: E0318 10:54:37.177363 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:54:41 crc kubenswrapper[4733]: I0318 10:54:41.182358 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:54:41 crc kubenswrapper[4733]: E0318 10:54:41.182892 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:54:44 crc kubenswrapper[4733]: I0318 10:54:44.176296 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:54:44 crc kubenswrapper[4733]: E0318 10:54:44.177319 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:54:48 crc kubenswrapper[4733]: I0318 10:54:48.176002 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:54:48 crc kubenswrapper[4733]: E0318 10:54:48.176711 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:54:55 crc kubenswrapper[4733]: I0318 10:54:55.175542 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:54:55 crc kubenswrapper[4733]: E0318 10:54:55.176545 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:54:58 crc kubenswrapper[4733]: I0318 10:54:58.176977 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:54:58 crc kubenswrapper[4733]: E0318 10:54:58.177997 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:55:02 crc kubenswrapper[4733]: I0318 10:55:02.176794 4733 scope.go:117] "RemoveContainer" 
containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:55:02 crc kubenswrapper[4733]: E0318 10:55:02.177696 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:55:10 crc kubenswrapper[4733]: I0318 10:55:10.175642 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:55:10 crc kubenswrapper[4733]: E0318 10:55:10.178133 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:55:11 crc kubenswrapper[4733]: I0318 10:55:11.180286 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:55:11 crc kubenswrapper[4733]: E0318 10:55:11.181009 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:55:13 crc kubenswrapper[4733]: I0318 10:55:13.176085 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:55:13 crc kubenswrapper[4733]: E0318 10:55:13.176537 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:55:25 crc kubenswrapper[4733]: I0318 10:55:25.177017 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:55:25 crc kubenswrapper[4733]: E0318 10:55:25.178029 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:55:25 crc kubenswrapper[4733]: I0318 10:55:25.178223 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:55:25 crc kubenswrapper[4733]: E0318 10:55:25.178783 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:55:27 crc kubenswrapper[4733]: I0318 10:55:27.175485 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:55:27 crc kubenswrapper[4733]: E0318 10:55:27.176290 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:55:36 crc kubenswrapper[4733]: I0318 10:55:36.176585 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:55:36 crc kubenswrapper[4733]: E0318 10:55:36.177713 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:55:38 crc kubenswrapper[4733]: I0318 10:55:38.177311 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:55:38 crc kubenswrapper[4733]: E0318 10:55:38.178674 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:55:39 crc kubenswrapper[4733]: I0318 10:55:39.176475 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:55:39 crc kubenswrapper[4733]: E0318 10:55:39.177456 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" 
podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:55:51 crc kubenswrapper[4733]: I0318 10:55:51.185110 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:55:51 crc kubenswrapper[4733]: E0318 10:55:51.187689 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:55:52 crc kubenswrapper[4733]: I0318 10:55:52.175664 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:55:52 crc kubenswrapper[4733]: E0318 10:55:52.176095 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:55:54 crc kubenswrapper[4733]: I0318 10:55:54.175655 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:55:54 crc kubenswrapper[4733]: E0318 10:55:54.176305 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.154624 4733 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563856-qhf5l"] Mar 18 10:56:00 crc kubenswrapper[4733]: E0318 10:56:00.155067 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2194f8e-8219-4d20-9657-bab035e9ce0b" containerName="oc" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.155087 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2194f8e-8219-4d20-9657-bab035e9ce0b" containerName="oc" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.155428 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2194f8e-8219-4d20-9657-bab035e9ce0b" containerName="oc" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.156336 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-qhf5l" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.161958 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.162835 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.163752 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.169937 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-qhf5l"] Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.226037 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmhtd\" (UniqueName: \"kubernetes.io/projected/e1a4900e-15be-4e3f-a8a2-3eb582acbc20-kube-api-access-vmhtd\") pod \"auto-csr-approver-29563856-qhf5l\" (UID: \"e1a4900e-15be-4e3f-a8a2-3eb582acbc20\") " pod="openshift-infra/auto-csr-approver-29563856-qhf5l" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 
10:56:00.328026 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmhtd\" (UniqueName: \"kubernetes.io/projected/e1a4900e-15be-4e3f-a8a2-3eb582acbc20-kube-api-access-vmhtd\") pod \"auto-csr-approver-29563856-qhf5l\" (UID: \"e1a4900e-15be-4e3f-a8a2-3eb582acbc20\") " pod="openshift-infra/auto-csr-approver-29563856-qhf5l" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.353943 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmhtd\" (UniqueName: \"kubernetes.io/projected/e1a4900e-15be-4e3f-a8a2-3eb582acbc20-kube-api-access-vmhtd\") pod \"auto-csr-approver-29563856-qhf5l\" (UID: \"e1a4900e-15be-4e3f-a8a2-3eb582acbc20\") " pod="openshift-infra/auto-csr-approver-29563856-qhf5l" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.518792 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-qhf5l" Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.805171 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-qhf5l"] Mar 18 10:56:00 crc kubenswrapper[4733]: W0318 10:56:00.807728 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1a4900e_15be_4e3f_a8a2_3eb582acbc20.slice/crio-74c5ad720d80d58ad8720f159874e34d6dba44d12a6a557c0ec232278f2fae2e WatchSource:0}: Error finding container 74c5ad720d80d58ad8720f159874e34d6dba44d12a6a557c0ec232278f2fae2e: Status 404 returned error can't find the container with id 74c5ad720d80d58ad8720f159874e34d6dba44d12a6a557c0ec232278f2fae2e Mar 18 10:56:00 crc kubenswrapper[4733]: I0318 10:56:00.810676 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 10:56:01 crc kubenswrapper[4733]: I0318 10:56:01.156725 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563856-qhf5l" event={"ID":"e1a4900e-15be-4e3f-a8a2-3eb582acbc20","Type":"ContainerStarted","Data":"74c5ad720d80d58ad8720f159874e34d6dba44d12a6a557c0ec232278f2fae2e"} Mar 18 10:56:02 crc kubenswrapper[4733]: I0318 10:56:02.165715 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-qhf5l" event={"ID":"e1a4900e-15be-4e3f-a8a2-3eb582acbc20","Type":"ContainerStarted","Data":"9c09a3cb9db31583aa867b9b2e7873c25af33c3cb06cde66bc60959b1e039850"} Mar 18 10:56:02 crc kubenswrapper[4733]: I0318 10:56:02.187684 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563856-qhf5l" podStartSLOduration=1.223335041 podStartE2EDuration="2.187665025s" podCreationTimestamp="2026-03-18 10:56:00 +0000 UTC" firstStartedPulling="2026-03-18 10:56:00.810167164 +0000 UTC m=+2600.301901519" lastFinishedPulling="2026-03-18 10:56:01.774497148 +0000 UTC m=+2601.266231503" observedRunningTime="2026-03-18 10:56:02.178024194 +0000 UTC m=+2601.669758529" watchObservedRunningTime="2026-03-18 10:56:02.187665025 +0000 UTC m=+2601.679399360" Mar 18 10:56:02 crc kubenswrapper[4733]: E0318 10:56:02.533777 4733 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1a4900e_15be_4e3f_a8a2_3eb582acbc20.slice/crio-conmon-9c09a3cb9db31583aa867b9b2e7873c25af33c3cb06cde66bc60959b1e039850.scope\": RecentStats: unable to find data in memory cache]" Mar 18 10:56:03 crc kubenswrapper[4733]: I0318 10:56:03.176890 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:56:03 crc kubenswrapper[4733]: E0318 10:56:03.180811 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 10:56:03 crc kubenswrapper[4733]: I0318 10:56:03.185532 4733 generic.go:334] "Generic (PLEG): container finished" podID="e1a4900e-15be-4e3f-a8a2-3eb582acbc20" containerID="9c09a3cb9db31583aa867b9b2e7873c25af33c3cb06cde66bc60959b1e039850" exitCode=0 Mar 18 10:56:03 crc kubenswrapper[4733]: I0318 10:56:03.203267 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-qhf5l" event={"ID":"e1a4900e-15be-4e3f-a8a2-3eb582acbc20","Type":"ContainerDied","Data":"9c09a3cb9db31583aa867b9b2e7873c25af33c3cb06cde66bc60959b1e039850"} Mar 18 10:56:04 crc kubenswrapper[4733]: I0318 10:56:04.533377 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-qhf5l" Mar 18 10:56:04 crc kubenswrapper[4733]: I0318 10:56:04.709640 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmhtd\" (UniqueName: \"kubernetes.io/projected/e1a4900e-15be-4e3f-a8a2-3eb582acbc20-kube-api-access-vmhtd\") pod \"e1a4900e-15be-4e3f-a8a2-3eb582acbc20\" (UID: \"e1a4900e-15be-4e3f-a8a2-3eb582acbc20\") " Mar 18 10:56:04 crc kubenswrapper[4733]: I0318 10:56:04.719777 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a4900e-15be-4e3f-a8a2-3eb582acbc20-kube-api-access-vmhtd" (OuterVolumeSpecName: "kube-api-access-vmhtd") pod "e1a4900e-15be-4e3f-a8a2-3eb582acbc20" (UID: "e1a4900e-15be-4e3f-a8a2-3eb582acbc20"). InnerVolumeSpecName "kube-api-access-vmhtd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:56:04 crc kubenswrapper[4733]: I0318 10:56:04.812026 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmhtd\" (UniqueName: \"kubernetes.io/projected/e1a4900e-15be-4e3f-a8a2-3eb582acbc20-kube-api-access-vmhtd\") on node \"crc\" DevicePath \"\"" Mar 18 10:56:05 crc kubenswrapper[4733]: I0318 10:56:05.176416 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:56:05 crc kubenswrapper[4733]: E0318 10:56:05.176781 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:56:05 crc kubenswrapper[4733]: I0318 10:56:05.207614 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563856-qhf5l" event={"ID":"e1a4900e-15be-4e3f-a8a2-3eb582acbc20","Type":"ContainerDied","Data":"74c5ad720d80d58ad8720f159874e34d6dba44d12a6a557c0ec232278f2fae2e"} Mar 18 10:56:05 crc kubenswrapper[4733]: I0318 10:56:05.207661 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74c5ad720d80d58ad8720f159874e34d6dba44d12a6a557c0ec232278f2fae2e" Mar 18 10:56:05 crc kubenswrapper[4733]: I0318 10:56:05.207690 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563856-qhf5l" Mar 18 10:56:05 crc kubenswrapper[4733]: I0318 10:56:05.620367 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-8gwpw"] Mar 18 10:56:05 crc kubenswrapper[4733]: I0318 10:56:05.631876 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563850-8gwpw"] Mar 18 10:56:07 crc kubenswrapper[4733]: I0318 10:56:07.186267 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2166cf23-2a65-4b17-922e-3131be1d6d8b" path="/var/lib/kubelet/pods/2166cf23-2a65-4b17-922e-3131be1d6d8b/volumes" Mar 18 10:56:10 crc kubenswrapper[4733]: I0318 10:56:10.114853 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:56:10 crc kubenswrapper[4733]: E0318 10:56:10.118835 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:56:11 crc kubenswrapper[4733]: I0318 10:56:11.107843 4733 scope.go:117] "RemoveContainer" containerID="649f36c8155821a228e5fee55c54c1d5edbde655cee7563ac249384dedef675b" Mar 18 10:56:15 crc kubenswrapper[4733]: I0318 10:56:15.275464 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48" Mar 18 10:56:16 crc kubenswrapper[4733]: I0318 10:56:16.175998 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:56:16 crc kubenswrapper[4733]: E0318 10:56:16.177420 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:56:16 crc kubenswrapper[4733]: I0318 10:56:16.317070 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"32198f7b4110f4b23718a4e872dd512bdbf76e8166cae4cab128ee6761e36a56"} Mar 18 10:56:21 crc kubenswrapper[4733]: I0318 10:56:21.183828 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:56:21 crc kubenswrapper[4733]: E0318 10:56:21.185022 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:56:31 crc kubenswrapper[4733]: I0318 10:56:31.184477 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:56:31 crc kubenswrapper[4733]: E0318 10:56:31.185539 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:56:34 crc kubenswrapper[4733]: I0318 10:56:34.175666 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:56:34 crc kubenswrapper[4733]: E0318 10:56:34.176171 4733 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:56:45 crc kubenswrapper[4733]: I0318 10:56:45.176084 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:56:45 crc kubenswrapper[4733]: E0318 10:56:45.177409 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:56:49 crc kubenswrapper[4733]: I0318 10:56:49.176556 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:56:49 crc kubenswrapper[4733]: E0318 10:56:49.177613 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:56:58 crc kubenswrapper[4733]: I0318 10:56:58.176347 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:56:58 crc kubenswrapper[4733]: E0318 10:56:58.177261 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" 
pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:57:03 crc kubenswrapper[4733]: I0318 10:57:03.176680 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:57:03 crc kubenswrapper[4733]: E0318 10:57:03.177736 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:57:09 crc kubenswrapper[4733]: I0318 10:57:09.176907 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:57:09 crc kubenswrapper[4733]: E0318 10:57:09.177911 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:57:18 crc kubenswrapper[4733]: I0318 10:57:18.175504 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:57:18 crc kubenswrapper[4733]: E0318 10:57:18.176469 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:57:22 crc kubenswrapper[4733]: I0318 10:57:22.176861 4733 scope.go:117] "RemoveContainer" 
containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:57:22 crc kubenswrapper[4733]: E0318 10:57:22.177591 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:57:30 crc kubenswrapper[4733]: I0318 10:57:30.175606 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:57:30 crc kubenswrapper[4733]: E0318 10:57:30.176234 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:57:34 crc kubenswrapper[4733]: I0318 10:57:34.175959 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:57:34 crc kubenswrapper[4733]: E0318 10:57:34.176825 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:57:45 crc kubenswrapper[4733]: I0318 10:57:45.175910 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:57:45 crc kubenswrapper[4733]: E0318 10:57:45.176949 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:57:49 crc kubenswrapper[4733]: I0318 10:57:49.176156 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:57:49 crc kubenswrapper[4733]: E0318 10:57:49.176782 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:57:58 crc kubenswrapper[4733]: I0318 10:57:58.176016 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:57:58 crc kubenswrapper[4733]: E0318 10:57:58.177101 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.152415 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563858-bxq6r"] Mar 18 10:58:00 crc kubenswrapper[4733]: E0318 10:58:00.152916 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a4900e-15be-4e3f-a8a2-3eb582acbc20" containerName="oc" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.152940 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a4900e-15be-4e3f-a8a2-3eb582acbc20" containerName="oc" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 
10:58:00.153345 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a4900e-15be-4e3f-a8a2-3eb582acbc20" containerName="oc" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.154138 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-bxq6r" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.157531 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.157890 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.159097 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.164718 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-bxq6r"] Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.347564 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9pdm\" (UniqueName: \"kubernetes.io/projected/7bc61f2d-1837-4253-a3a3-91d8acc950f8-kube-api-access-h9pdm\") pod \"auto-csr-approver-29563858-bxq6r\" (UID: \"7bc61f2d-1837-4253-a3a3-91d8acc950f8\") " pod="openshift-infra/auto-csr-approver-29563858-bxq6r" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.449476 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9pdm\" (UniqueName: \"kubernetes.io/projected/7bc61f2d-1837-4253-a3a3-91d8acc950f8-kube-api-access-h9pdm\") pod \"auto-csr-approver-29563858-bxq6r\" (UID: \"7bc61f2d-1837-4253-a3a3-91d8acc950f8\") " pod="openshift-infra/auto-csr-approver-29563858-bxq6r" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.486904 4733 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h9pdm\" (UniqueName: \"kubernetes.io/projected/7bc61f2d-1837-4253-a3a3-91d8acc950f8-kube-api-access-h9pdm\") pod \"auto-csr-approver-29563858-bxq6r\" (UID: \"7bc61f2d-1837-4253-a3a3-91d8acc950f8\") " pod="openshift-infra/auto-csr-approver-29563858-bxq6r" Mar 18 10:58:00 crc kubenswrapper[4733]: I0318 10:58:00.780547 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-bxq6r" Mar 18 10:58:01 crc kubenswrapper[4733]: I0318 10:58:01.182595 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:58:01 crc kubenswrapper[4733]: E0318 10:58:01.183616 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:58:01 crc kubenswrapper[4733]: I0318 10:58:01.336284 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-bxq6r"] Mar 18 10:58:01 crc kubenswrapper[4733]: I0318 10:58:01.551505 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-bxq6r" event={"ID":"7bc61f2d-1837-4253-a3a3-91d8acc950f8","Type":"ContainerStarted","Data":"572677742ca6f2dd1a03ef173b7a9bb884073e34767cd09079a938a5f114183a"} Mar 18 10:58:03 crc kubenswrapper[4733]: I0318 10:58:03.571515 4733 generic.go:334] "Generic (PLEG): container finished" podID="7bc61f2d-1837-4253-a3a3-91d8acc950f8" containerID="ae5d77ede52fa11bd913773d4add1f40cd6fcaf6154c4236eccd984879ea57ff" exitCode=0 Mar 18 10:58:03 crc kubenswrapper[4733]: I0318 10:58:03.571611 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563858-bxq6r" event={"ID":"7bc61f2d-1837-4253-a3a3-91d8acc950f8","Type":"ContainerDied","Data":"ae5d77ede52fa11bd913773d4add1f40cd6fcaf6154c4236eccd984879ea57ff"} Mar 18 10:58:05 crc kubenswrapper[4733]: I0318 10:58:05.013042 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-bxq6r" Mar 18 10:58:05 crc kubenswrapper[4733]: I0318 10:58:05.140444 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9pdm\" (UniqueName: \"kubernetes.io/projected/7bc61f2d-1837-4253-a3a3-91d8acc950f8-kube-api-access-h9pdm\") pod \"7bc61f2d-1837-4253-a3a3-91d8acc950f8\" (UID: \"7bc61f2d-1837-4253-a3a3-91d8acc950f8\") " Mar 18 10:58:05 crc kubenswrapper[4733]: I0318 10:58:05.147245 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc61f2d-1837-4253-a3a3-91d8acc950f8-kube-api-access-h9pdm" (OuterVolumeSpecName: "kube-api-access-h9pdm") pod "7bc61f2d-1837-4253-a3a3-91d8acc950f8" (UID: "7bc61f2d-1837-4253-a3a3-91d8acc950f8"). InnerVolumeSpecName "kube-api-access-h9pdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:58:05 crc kubenswrapper[4733]: I0318 10:58:05.243077 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9pdm\" (UniqueName: \"kubernetes.io/projected/7bc61f2d-1837-4253-a3a3-91d8acc950f8-kube-api-access-h9pdm\") on node \"crc\" DevicePath \"\"" Mar 18 10:58:05 crc kubenswrapper[4733]: I0318 10:58:05.607396 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563858-bxq6r" event={"ID":"7bc61f2d-1837-4253-a3a3-91d8acc950f8","Type":"ContainerDied","Data":"572677742ca6f2dd1a03ef173b7a9bb884073e34767cd09079a938a5f114183a"} Mar 18 10:58:05 crc kubenswrapper[4733]: I0318 10:58:05.607713 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="572677742ca6f2dd1a03ef173b7a9bb884073e34767cd09079a938a5f114183a" Mar 18 10:58:05 crc kubenswrapper[4733]: I0318 10:58:05.607641 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563858-bxq6r" Mar 18 10:58:06 crc kubenswrapper[4733]: I0318 10:58:06.111459 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-nxds9"] Mar 18 10:58:06 crc kubenswrapper[4733]: I0318 10:58:06.121554 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563852-nxds9"] Mar 18 10:58:07 crc kubenswrapper[4733]: I0318 10:58:07.192592 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21264fda-07b1-4a7f-ac61-432c6dc9a230" path="/var/lib/kubelet/pods/21264fda-07b1-4a7f-ac61-432c6dc9a230/volumes" Mar 18 10:58:11 crc kubenswrapper[4733]: I0318 10:58:11.204407 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:58:11 crc kubenswrapper[4733]: E0318 10:58:11.205357 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:58:11 crc kubenswrapper[4733]: I0318 10:58:11.235120 4733 scope.go:117] "RemoveContainer" containerID="6234fbb28241739a6b36f7e66aab35dc25489c2ddedad91a0ad07ea33e77be17" Mar 18 10:58:14 crc kubenswrapper[4733]: I0318 10:58:14.175844 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:58:14 crc kubenswrapper[4733]: E0318 10:58:14.176666 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:58:23 crc kubenswrapper[4733]: I0318 10:58:23.176116 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:58:23 crc kubenswrapper[4733]: E0318 10:58:23.177109 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:58:27 crc kubenswrapper[4733]: I0318 10:58:27.175751 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:58:27 crc kubenswrapper[4733]: E0318 10:58:27.176216 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:58:35 crc kubenswrapper[4733]: I0318 10:58:35.178311 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:58:35 crc kubenswrapper[4733]: E0318 10:58:35.179333 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:58:41 crc kubenswrapper[4733]: I0318 10:58:41.188650 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:58:41 crc kubenswrapper[4733]: E0318 10:58:41.189879 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:58:43 crc kubenswrapper[4733]: I0318 10:58:43.571572 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 10:58:43 crc kubenswrapper[4733]: I0318 10:58:43.571990 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.195804 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dpxzp"] Mar 18 10:58:47 crc kubenswrapper[4733]: E0318 10:58:47.196717 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc61f2d-1837-4253-a3a3-91d8acc950f8" containerName="oc" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.196739 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc61f2d-1837-4253-a3a3-91d8acc950f8" containerName="oc" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.197102 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc61f2d-1837-4253-a3a3-91d8acc950f8" containerName="oc" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.209934 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.219148 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dpxzp"] Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.343762 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-catalog-content\") pod \"community-operators-dpxzp\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.344155 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4v5l\" (UniqueName: \"kubernetes.io/projected/d2b9d483-b743-44ab-bf0b-f0b22f133576-kube-api-access-q4v5l\") pod \"community-operators-dpxzp\" (UID: 
\"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.344352 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-utilities\") pod \"community-operators-dpxzp\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.445978 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-catalog-content\") pod \"community-operators-dpxzp\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.446170 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4v5l\" (UniqueName: \"kubernetes.io/projected/d2b9d483-b743-44ab-bf0b-f0b22f133576-kube-api-access-q4v5l\") pod \"community-operators-dpxzp\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.446303 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-utilities\") pod \"community-operators-dpxzp\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.446786 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-utilities\") pod \"community-operators-dpxzp\" (UID: 
\"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.446854 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-catalog-content\") pod \"community-operators-dpxzp\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.482235 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4v5l\" (UniqueName: \"kubernetes.io/projected/d2b9d483-b743-44ab-bf0b-f0b22f133576-kube-api-access-q4v5l\") pod \"community-operators-dpxzp\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:47 crc kubenswrapper[4733]: I0318 10:58:47.563472 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:48 crc kubenswrapper[4733]: I0318 10:58:48.135447 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dpxzp"] Mar 18 10:58:48 crc kubenswrapper[4733]: W0318 10:58:48.145417 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2b9d483_b743_44ab_bf0b_f0b22f133576.slice/crio-5d45d05271d7475d8f987b38020b7a838549e3626fa27dfe3061a1df224ec8fe WatchSource:0}: Error finding container 5d45d05271d7475d8f987b38020b7a838549e3626fa27dfe3061a1df224ec8fe: Status 404 returned error can't find the container with id 5d45d05271d7475d8f987b38020b7a838549e3626fa27dfe3061a1df224ec8fe Mar 18 10:58:48 crc kubenswrapper[4733]: I0318 10:58:48.175973 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:58:49 crc kubenswrapper[4733]: I0318 10:58:49.026403 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"} Mar 18 10:58:49 crc kubenswrapper[4733]: I0318 10:58:49.026998 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 10:58:49 crc kubenswrapper[4733]: I0318 10:58:49.031061 4733 generic.go:334] "Generic (PLEG): container finished" podID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerID="d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680" exitCode=0 Mar 18 10:58:49 crc kubenswrapper[4733]: I0318 10:58:49.031091 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpxzp" 
event={"ID":"d2b9d483-b743-44ab-bf0b-f0b22f133576","Type":"ContainerDied","Data":"d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680"} Mar 18 10:58:49 crc kubenswrapper[4733]: I0318 10:58:49.031133 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpxzp" event={"ID":"d2b9d483-b743-44ab-bf0b-f0b22f133576","Type":"ContainerStarted","Data":"5d45d05271d7475d8f987b38020b7a838549e3626fa27dfe3061a1df224ec8fe"} Mar 18 10:58:51 crc kubenswrapper[4733]: I0318 10:58:51.057508 4733 generic.go:334] "Generic (PLEG): container finished" podID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerID="00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9" exitCode=0 Mar 18 10:58:51 crc kubenswrapper[4733]: I0318 10:58:51.058598 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpxzp" event={"ID":"d2b9d483-b743-44ab-bf0b-f0b22f133576","Type":"ContainerDied","Data":"00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9"} Mar 18 10:58:52 crc kubenswrapper[4733]: I0318 10:58:52.073454 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpxzp" event={"ID":"d2b9d483-b743-44ab-bf0b-f0b22f133576","Type":"ContainerStarted","Data":"44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16"} Mar 18 10:58:52 crc kubenswrapper[4733]: I0318 10:58:52.115177 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dpxzp" podStartSLOduration=2.62362291 podStartE2EDuration="5.115150355s" podCreationTimestamp="2026-03-18 10:58:47 +0000 UTC" firstStartedPulling="2026-03-18 10:58:49.032406064 +0000 UTC m=+2768.524140419" lastFinishedPulling="2026-03-18 10:58:51.523933499 +0000 UTC m=+2771.015667864" observedRunningTime="2026-03-18 10:58:52.101786559 +0000 UTC m=+2771.593520934" watchObservedRunningTime="2026-03-18 10:58:52.115150355 +0000 UTC 
m=+2771.606884710" Mar 18 10:58:53 crc kubenswrapper[4733]: I0318 10:58:53.088906 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" exitCode=0 Mar 18 10:58:53 crc kubenswrapper[4733]: I0318 10:58:53.088969 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"} Mar 18 10:58:53 crc kubenswrapper[4733]: I0318 10:58:53.089507 4733 scope.go:117] "RemoveContainer" containerID="ce9239548d170b75a5ed09a485a37b22892a226ac679cb077b3445d3c2f2c187" Mar 18 10:58:53 crc kubenswrapper[4733]: I0318 10:58:53.091076 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 10:58:53 crc kubenswrapper[4733]: E0318 10:58:53.091644 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 10:58:54 crc kubenswrapper[4733]: I0318 10:58:54.175587 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:58:55 crc kubenswrapper[4733]: I0318 10:58:55.115132 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"} Mar 18 10:58:55 crc kubenswrapper[4733]: I0318 10:58:55.115707 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 
18 10:58:57 crc kubenswrapper[4733]: I0318 10:58:57.563899 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:57 crc kubenswrapper[4733]: I0318 10:58:57.564311 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:57 crc kubenswrapper[4733]: I0318 10:58:57.670470 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:58 crc kubenswrapper[4733]: I0318 10:58:58.223333 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:58:58 crc kubenswrapper[4733]: I0318 10:58:58.295830 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dpxzp"] Mar 18 10:58:59 crc kubenswrapper[4733]: I0318 10:58:59.168746 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" exitCode=0 Mar 18 10:58:59 crc kubenswrapper[4733]: I0318 10:58:59.169025 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"} Mar 18 10:58:59 crc kubenswrapper[4733]: I0318 10:58:59.169447 4733 scope.go:117] "RemoveContainer" containerID="fb87c31929d690ace0713c6e835580b64a2fb69bcf3837bfb62aeeeefbe16b5c" Mar 18 10:58:59 crc kubenswrapper[4733]: I0318 10:58:59.170299 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 10:58:59 crc kubenswrapper[4733]: E0318 10:58:59.170658 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.185897 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dpxzp" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerName="registry-server" containerID="cri-o://44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16" gracePeriod=2 Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.622600 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dpxzp" Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.694477 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-catalog-content\") pod \"d2b9d483-b743-44ab-bf0b-f0b22f133576\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.694577 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4v5l\" (UniqueName: \"kubernetes.io/projected/d2b9d483-b743-44ab-bf0b-f0b22f133576-kube-api-access-q4v5l\") pod \"d2b9d483-b743-44ab-bf0b-f0b22f133576\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.694656 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-utilities\") pod \"d2b9d483-b743-44ab-bf0b-f0b22f133576\" (UID: \"d2b9d483-b743-44ab-bf0b-f0b22f133576\") " Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.695851 4733 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-utilities" (OuterVolumeSpecName: "utilities") pod "d2b9d483-b743-44ab-bf0b-f0b22f133576" (UID: "d2b9d483-b743-44ab-bf0b-f0b22f133576"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.704118 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b9d483-b743-44ab-bf0b-f0b22f133576-kube-api-access-q4v5l" (OuterVolumeSpecName: "kube-api-access-q4v5l") pod "d2b9d483-b743-44ab-bf0b-f0b22f133576" (UID: "d2b9d483-b743-44ab-bf0b-f0b22f133576"). InnerVolumeSpecName "kube-api-access-q4v5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.757139 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2b9d483-b743-44ab-bf0b-f0b22f133576" (UID: "d2b9d483-b743-44ab-bf0b-f0b22f133576"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.797216 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.797244 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4v5l\" (UniqueName: \"kubernetes.io/projected/d2b9d483-b743-44ab-bf0b-f0b22f133576-kube-api-access-q4v5l\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:00 crc kubenswrapper[4733]: I0318 10:59:00.797258 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b9d483-b743-44ab-bf0b-f0b22f133576-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.203745 4733 generic.go:334] "Generic (PLEG): container finished" podID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerID="44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16" exitCode=0 Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.203860 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpxzp" event={"ID":"d2b9d483-b743-44ab-bf0b-f0b22f133576","Type":"ContainerDied","Data":"44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16"} Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.204289 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpxzp" event={"ID":"d2b9d483-b743-44ab-bf0b-f0b22f133576","Type":"ContainerDied","Data":"5d45d05271d7475d8f987b38020b7a838549e3626fa27dfe3061a1df224ec8fe"} Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.204331 4733 scope.go:117] "RemoveContainer" containerID="44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16" Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 
10:59:01.203981 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dpxzp"
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.237723 4733 scope.go:117] "RemoveContainer" containerID="00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9"
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.263116 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dpxzp"]
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.273329 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dpxzp"]
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.282764 4733 scope.go:117] "RemoveContainer" containerID="d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680"
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.327415 4733 scope.go:117] "RemoveContainer" containerID="44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16"
Mar 18 10:59:01 crc kubenswrapper[4733]: E0318 10:59:01.327937 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16\": container with ID starting with 44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16 not found: ID does not exist" containerID="44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16"
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.327991 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16"} err="failed to get container status \"44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16\": rpc error: code = NotFound desc = could not find container \"44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16\": container with ID starting with 44b4592be70be18f7abf508277d2b60867509d702745fc39e315eb1db7399b16 not found: ID does not exist"
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.328026 4733 scope.go:117] "RemoveContainer" containerID="00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9"
Mar 18 10:59:01 crc kubenswrapper[4733]: E0318 10:59:01.328559 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9\": container with ID starting with 00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9 not found: ID does not exist" containerID="00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9"
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.328600 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9"} err="failed to get container status \"00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9\": rpc error: code = NotFound desc = could not find container \"00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9\": container with ID starting with 00990b4a98758f8d96d9c56f65f35d6f4152d627814babba7b1c188d594f8df9 not found: ID does not exist"
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.328624 4733 scope.go:117] "RemoveContainer" containerID="d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680"
Mar 18 10:59:01 crc kubenswrapper[4733]: E0318 10:59:01.329029 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680\": container with ID starting with d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680 not found: ID does not exist" containerID="d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680"
Mar 18 10:59:01 crc kubenswrapper[4733]: I0318 10:59:01.329100 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680"} err="failed to get container status \"d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680\": rpc error: code = NotFound desc = could not find container \"d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680\": container with ID starting with d16fb3f865634c313d04cd15fe257f949ffe967e94771f738a23b25fe3ead680 not found: ID does not exist"
Mar 18 10:59:03 crc kubenswrapper[4733]: I0318 10:59:03.194821 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" path="/var/lib/kubelet/pods/d2b9d483-b743-44ab-bf0b-f0b22f133576/volumes"
Mar 18 10:59:05 crc kubenswrapper[4733]: I0318 10:59:05.176151 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 10:59:05 crc kubenswrapper[4733]: E0318 10:59:05.176653 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:59:11 crc kubenswrapper[4733]: I0318 10:59:11.184602 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 10:59:11 crc kubenswrapper[4733]: E0318 10:59:11.185832 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:59:13 crc kubenswrapper[4733]: I0318 10:59:13.571737 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:59:13 crc kubenswrapper[4733]: I0318 10:59:13.572148 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:59:16 crc kubenswrapper[4733]: I0318 10:59:16.176098 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 10:59:16 crc kubenswrapper[4733]: E0318 10:59:16.177041 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:59:22 crc kubenswrapper[4733]: I0318 10:59:22.176163 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 10:59:22 crc kubenswrapper[4733]: E0318 10:59:22.177531 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:59:29 crc kubenswrapper[4733]: I0318 10:59:29.176170 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 10:59:29 crc kubenswrapper[4733]: E0318 10:59:29.177099 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:59:37 crc kubenswrapper[4733]: I0318 10:59:37.175424 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 10:59:37 crc kubenswrapper[4733]: E0318 10:59:37.176346 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:59:42 crc kubenswrapper[4733]: I0318 10:59:42.175272 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 10:59:42 crc kubenswrapper[4733]: E0318 10:59:42.176114 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 10:59:43 crc kubenswrapper[4733]: I0318 10:59:43.570914 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 10:59:43 crc kubenswrapper[4733]: I0318 10:59:43.571006 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 10:59:43 crc kubenswrapper[4733]: I0318 10:59:43.571100 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp"
Mar 18 10:59:43 crc kubenswrapper[4733]: I0318 10:59:43.572142 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32198f7b4110f4b23718a4e872dd512bdbf76e8166cae4cab128ee6761e36a56"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 10:59:43 crc kubenswrapper[4733]: I0318 10:59:43.572343 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://32198f7b4110f4b23718a4e872dd512bdbf76e8166cae4cab128ee6761e36a56" gracePeriod=600
Mar 18 10:59:44 crc kubenswrapper[4733]: I0318 10:59:44.662852 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="32198f7b4110f4b23718a4e872dd512bdbf76e8166cae4cab128ee6761e36a56" exitCode=0
Mar 18 10:59:44 crc kubenswrapper[4733]: I0318 10:59:44.662911 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"32198f7b4110f4b23718a4e872dd512bdbf76e8166cae4cab128ee6761e36a56"}
Mar 18 10:59:44 crc kubenswrapper[4733]: I0318 10:59:44.663456 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"}
Mar 18 10:59:44 crc kubenswrapper[4733]: I0318 10:59:44.663481 4733 scope.go:117] "RemoveContainer" containerID="13079617ef56fbdc98c390ba5bdaff3c5530411f54f691fbeb11894744ecac48"
Mar 18 10:59:52 crc kubenswrapper[4733]: I0318 10:59:52.176753 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 10:59:52 crc kubenswrapper[4733]: E0318 10:59:52.177914 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 10:59:57 crc kubenswrapper[4733]: I0318 10:59:57.175388 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 10:59:57 crc kubenswrapper[4733]: E0318 10:59:57.177337 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.177828 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"]
Mar 18 11:00:00 crc kubenswrapper[4733]: E0318 11:00:00.179031 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerName="registry-server"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.179053 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerName="registry-server"
Mar 18 11:00:00 crc kubenswrapper[4733]: E0318 11:00:00.179085 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerName="extract-utilities"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.179100 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerName="extract-utilities"
Mar 18 11:00:00 crc kubenswrapper[4733]: E0318 11:00:00.179162 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerName="extract-content"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.179177 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerName="extract-content"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.179546 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b9d483-b743-44ab-bf0b-f0b22f133576" containerName="registry-server"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.180560 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.183861 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.189783 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563860-2k4dj"]
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.191664 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-2k4dj"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.193699 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.194018 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.200451 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.201311 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.212695 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"]
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.219347 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-2k4dj"]
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.271921 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgjxh\" (UniqueName: \"kubernetes.io/projected/bf991012-f463-47ee-83f1-98bd34adcf1e-kube-api-access-sgjxh\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.272045 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf991012-f463-47ee-83f1-98bd34adcf1e-config-volume\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.272122 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf991012-f463-47ee-83f1-98bd34adcf1e-secret-volume\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.373655 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf991012-f463-47ee-83f1-98bd34adcf1e-config-volume\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.373730 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf991012-f463-47ee-83f1-98bd34adcf1e-secret-volume\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.373757 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqq4v\" (UniqueName: \"kubernetes.io/projected/6e41c70a-6d8d-47a8-9caf-57f46a60f96a-kube-api-access-jqq4v\") pod \"auto-csr-approver-29563860-2k4dj\" (UID: \"6e41c70a-6d8d-47a8-9caf-57f46a60f96a\") " pod="openshift-infra/auto-csr-approver-29563860-2k4dj"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.373789 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgjxh\" (UniqueName: \"kubernetes.io/projected/bf991012-f463-47ee-83f1-98bd34adcf1e-kube-api-access-sgjxh\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.374470 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf991012-f463-47ee-83f1-98bd34adcf1e-config-volume\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.390844 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgjxh\" (UniqueName: \"kubernetes.io/projected/bf991012-f463-47ee-83f1-98bd34adcf1e-kube-api-access-sgjxh\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.393652 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf991012-f463-47ee-83f1-98bd34adcf1e-secret-volume\") pod \"collect-profiles-29563860-dfn6s\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.474928 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqq4v\" (UniqueName: \"kubernetes.io/projected/6e41c70a-6d8d-47a8-9caf-57f46a60f96a-kube-api-access-jqq4v\") pod \"auto-csr-approver-29563860-2k4dj\" (UID: \"6e41c70a-6d8d-47a8-9caf-57f46a60f96a\") " pod="openshift-infra/auto-csr-approver-29563860-2k4dj"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.492654 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqq4v\" (UniqueName: \"kubernetes.io/projected/6e41c70a-6d8d-47a8-9caf-57f46a60f96a-kube-api-access-jqq4v\") pod \"auto-csr-approver-29563860-2k4dj\" (UID: \"6e41c70a-6d8d-47a8-9caf-57f46a60f96a\") " pod="openshift-infra/auto-csr-approver-29563860-2k4dj"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.510885 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.535534 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-2k4dj"
Mar 18 11:00:00 crc kubenswrapper[4733]: I0318 11:00:00.985199 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"]
Mar 18 11:00:00 crc kubenswrapper[4733]: W0318 11:00:00.993111 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf991012_f463_47ee_83f1_98bd34adcf1e.slice/crio-064af1ddf7399134cda4e62b057d6eb2b8148ac3892e6bd388e41a03a7f1c70d WatchSource:0}: Error finding container 064af1ddf7399134cda4e62b057d6eb2b8148ac3892e6bd388e41a03a7f1c70d: Status 404 returned error can't find the container with id 064af1ddf7399134cda4e62b057d6eb2b8148ac3892e6bd388e41a03a7f1c70d
Mar 18 11:00:01 crc kubenswrapper[4733]: I0318 11:00:01.057273 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-2k4dj"]
Mar 18 11:00:01 crc kubenswrapper[4733]: W0318 11:00:01.061065 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e41c70a_6d8d_47a8_9caf_57f46a60f96a.slice/crio-d414ba428ab0652aa66cf66c5272a3ace20f3bd338bfc094c7e21c52ebfae593 WatchSource:0}: Error finding container d414ba428ab0652aa66cf66c5272a3ace20f3bd338bfc094c7e21c52ebfae593: Status 404 returned error can't find the container with id d414ba428ab0652aa66cf66c5272a3ace20f3bd338bfc094c7e21c52ebfae593
Mar 18 11:00:01 crc kubenswrapper[4733]: I0318 11:00:01.843257 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-2k4dj" event={"ID":"6e41c70a-6d8d-47a8-9caf-57f46a60f96a","Type":"ContainerStarted","Data":"d414ba428ab0652aa66cf66c5272a3ace20f3bd338bfc094c7e21c52ebfae593"}
Mar 18 11:00:01 crc kubenswrapper[4733]: I0318 11:00:01.845388 4733 generic.go:334] "Generic (PLEG): container finished" podID="bf991012-f463-47ee-83f1-98bd34adcf1e" containerID="2799cecba73aefe0ec40d27bcbc9d2213abfeaa8dc27c148695183a3c55debbd" exitCode=0
Mar 18 11:00:01 crc kubenswrapper[4733]: I0318 11:00:01.845452 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s" event={"ID":"bf991012-f463-47ee-83f1-98bd34adcf1e","Type":"ContainerDied","Data":"2799cecba73aefe0ec40d27bcbc9d2213abfeaa8dc27c148695183a3c55debbd"}
Mar 18 11:00:01 crc kubenswrapper[4733]: I0318 11:00:01.845534 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s" event={"ID":"bf991012-f463-47ee-83f1-98bd34adcf1e","Type":"ContainerStarted","Data":"064af1ddf7399134cda4e62b057d6eb2b8148ac3892e6bd388e41a03a7f1c70d"}
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.170115 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.345549 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf991012-f463-47ee-83f1-98bd34adcf1e-secret-volume\") pod \"bf991012-f463-47ee-83f1-98bd34adcf1e\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") "
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.345632 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgjxh\" (UniqueName: \"kubernetes.io/projected/bf991012-f463-47ee-83f1-98bd34adcf1e-kube-api-access-sgjxh\") pod \"bf991012-f463-47ee-83f1-98bd34adcf1e\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") "
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.345818 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf991012-f463-47ee-83f1-98bd34adcf1e-config-volume\") pod \"bf991012-f463-47ee-83f1-98bd34adcf1e\" (UID: \"bf991012-f463-47ee-83f1-98bd34adcf1e\") "
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.346665 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf991012-f463-47ee-83f1-98bd34adcf1e-config-volume" (OuterVolumeSpecName: "config-volume") pod "bf991012-f463-47ee-83f1-98bd34adcf1e" (UID: "bf991012-f463-47ee-83f1-98bd34adcf1e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.351392 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf991012-f463-47ee-83f1-98bd34adcf1e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bf991012-f463-47ee-83f1-98bd34adcf1e" (UID: "bf991012-f463-47ee-83f1-98bd34adcf1e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.352386 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf991012-f463-47ee-83f1-98bd34adcf1e-kube-api-access-sgjxh" (OuterVolumeSpecName: "kube-api-access-sgjxh") pod "bf991012-f463-47ee-83f1-98bd34adcf1e" (UID: "bf991012-f463-47ee-83f1-98bd34adcf1e"). InnerVolumeSpecName "kube-api-access-sgjxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.447381 4733 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf991012-f463-47ee-83f1-98bd34adcf1e-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.447625 4733 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf991012-f463-47ee-83f1-98bd34adcf1e-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.447639 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgjxh\" (UniqueName: \"kubernetes.io/projected/bf991012-f463-47ee-83f1-98bd34adcf1e-kube-api-access-sgjxh\") on node \"crc\" DevicePath \"\""
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.862379 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s" event={"ID":"bf991012-f463-47ee-83f1-98bd34adcf1e","Type":"ContainerDied","Data":"064af1ddf7399134cda4e62b057d6eb2b8148ac3892e6bd388e41a03a7f1c70d"}
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.862425 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="064af1ddf7399134cda4e62b057d6eb2b8148ac3892e6bd388e41a03a7f1c70d"
Mar 18 11:00:03 crc kubenswrapper[4733]: I0318 11:00:03.862448 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563860-dfn6s"
Mar 18 11:00:04 crc kubenswrapper[4733]: I0318 11:00:04.175506 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 11:00:04 crc kubenswrapper[4733]: E0318 11:00:04.175967 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:00:04 crc kubenswrapper[4733]: I0318 11:00:04.267913 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"]
Mar 18 11:00:04 crc kubenswrapper[4733]: I0318 11:00:04.279559 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563815-tsrs6"]
Mar 18 11:00:05 crc kubenswrapper[4733]: I0318 11:00:05.186106 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d915f7d2-5b4d-4017-a839-b615a182fafb" path="/var/lib/kubelet/pods/d915f7d2-5b4d-4017-a839-b615a182fafb/volumes"
Mar 18 11:00:10 crc kubenswrapper[4733]: I0318 11:00:10.175925 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 11:00:10 crc kubenswrapper[4733]: E0318 11:00:10.176735 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:00:11 crc kubenswrapper[4733]: I0318 11:00:11.372121 4733 scope.go:117] "RemoveContainer" containerID="6068780e861c95e2a5524c6995b5943bf2eb924f4e716f49bfa978772d8dc58d"
Mar 18 11:00:18 crc kubenswrapper[4733]: I0318 11:00:18.890784 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r62h8"]
Mar 18 11:00:18 crc kubenswrapper[4733]: E0318 11:00:18.891752 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf991012-f463-47ee-83f1-98bd34adcf1e" containerName="collect-profiles"
Mar 18 11:00:18 crc kubenswrapper[4733]: I0318 11:00:18.891770 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf991012-f463-47ee-83f1-98bd34adcf1e" containerName="collect-profiles"
Mar 18 11:00:18 crc kubenswrapper[4733]: I0318 11:00:18.892021 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf991012-f463-47ee-83f1-98bd34adcf1e" containerName="collect-profiles"
Mar 18 11:00:18 crc kubenswrapper[4733]: I0318 11:00:18.893567 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:18 crc kubenswrapper[4733]: I0318 11:00:18.914410 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r62h8"]
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.014057 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-catalog-content\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.014123 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfptj\" (UniqueName: \"kubernetes.io/projected/c9b1aa26-b953-4a50-a4b6-0edc261c5036-kube-api-access-qfptj\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.014467 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-utilities\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.115686 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-catalog-content\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.115744 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfptj\" (UniqueName: \"kubernetes.io/projected/c9b1aa26-b953-4a50-a4b6-0edc261c5036-kube-api-access-qfptj\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.115818 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-utilities\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.116395 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-catalog-content\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.116414 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-utilities\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.141436 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfptj\" (UniqueName: \"kubernetes.io/projected/c9b1aa26-b953-4a50-a4b6-0edc261c5036-kube-api-access-qfptj\") pod \"redhat-operators-r62h8\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.175580 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 11:00:19 crc kubenswrapper[4733]: E0318 11:00:19.176001 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.222053 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r62h8"
Mar 18 11:00:19 crc kubenswrapper[4733]: I0318 11:00:19.683912 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r62h8"]
Mar 18 11:00:20 crc kubenswrapper[4733]: I0318 11:00:20.002108 4733 generic.go:334] "Generic (PLEG): container finished" podID="6e41c70a-6d8d-47a8-9caf-57f46a60f96a" containerID="ee8a5931d088bb90e3f8edd41217a30f581b7d88c4f982136e16b0f2b145d28c" exitCode=0
Mar 18 11:00:20 crc kubenswrapper[4733]: I0318 11:00:20.002432 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-2k4dj" event={"ID":"6e41c70a-6d8d-47a8-9caf-57f46a60f96a","Type":"ContainerDied","Data":"ee8a5931d088bb90e3f8edd41217a30f581b7d88c4f982136e16b0f2b145d28c"}
Mar 18 11:00:20 crc kubenswrapper[4733]: I0318 11:00:20.004126 4733 generic.go:334] "Generic (PLEG): container finished" podID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerID="f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317" exitCode=0
Mar 18 11:00:20 crc kubenswrapper[4733]: I0318 11:00:20.004154 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r62h8" event={"ID":"c9b1aa26-b953-4a50-a4b6-0edc261c5036","Type":"ContainerDied","Data":"f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317"}
Mar 18 11:00:20 crc kubenswrapper[4733]: I0318 11:00:20.004168 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r62h8" event={"ID":"c9b1aa26-b953-4a50-a4b6-0edc261c5036","Type":"ContainerStarted","Data":"04995b1b3f36eef24f7f078917a3d8bd2c408f0c02eb5dd2635fe0b5f1c102f9"}
Mar 18 11:00:21 crc kubenswrapper[4733]: I0318 11:00:21.385584 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-2k4dj"
Mar 18 11:00:21 crc kubenswrapper[4733]: I0318 11:00:21.458249 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqq4v\" (UniqueName: \"kubernetes.io/projected/6e41c70a-6d8d-47a8-9caf-57f46a60f96a-kube-api-access-jqq4v\") pod \"6e41c70a-6d8d-47a8-9caf-57f46a60f96a\" (UID: \"6e41c70a-6d8d-47a8-9caf-57f46a60f96a\") "
Mar 18 11:00:21 crc kubenswrapper[4733]: I0318 11:00:21.463388 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e41c70a-6d8d-47a8-9caf-57f46a60f96a-kube-api-access-jqq4v" (OuterVolumeSpecName: "kube-api-access-jqq4v") pod "6e41c70a-6d8d-47a8-9caf-57f46a60f96a" (UID: "6e41c70a-6d8d-47a8-9caf-57f46a60f96a"). InnerVolumeSpecName "kube-api-access-jqq4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:00:21 crc kubenswrapper[4733]: I0318 11:00:21.559811 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqq4v\" (UniqueName: \"kubernetes.io/projected/6e41c70a-6d8d-47a8-9caf-57f46a60f96a-kube-api-access-jqq4v\") on node \"crc\" DevicePath \"\""
Mar 18 11:00:22 crc kubenswrapper[4733]: I0318 11:00:22.020473 4733 generic.go:334] "Generic (PLEG): container finished" podID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerID="104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904" exitCode=0
Mar 18 11:00:22 crc kubenswrapper[4733]: I0318 11:00:22.020825 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r62h8" event={"ID":"c9b1aa26-b953-4a50-a4b6-0edc261c5036","Type":"ContainerDied","Data":"104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904"}
Mar 18 11:00:22 crc kubenswrapper[4733]: I0318 11:00:22.023319 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563860-2k4dj"
event={"ID":"6e41c70a-6d8d-47a8-9caf-57f46a60f96a","Type":"ContainerDied","Data":"d414ba428ab0652aa66cf66c5272a3ace20f3bd338bfc094c7e21c52ebfae593"} Mar 18 11:00:22 crc kubenswrapper[4733]: I0318 11:00:22.023365 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d414ba428ab0652aa66cf66c5272a3ace20f3bd338bfc094c7e21c52ebfae593" Mar 18 11:00:22 crc kubenswrapper[4733]: I0318 11:00:22.023435 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563860-2k4dj" Mar 18 11:00:22 crc kubenswrapper[4733]: I0318 11:00:22.464638 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-8xm7m"] Mar 18 11:00:22 crc kubenswrapper[4733]: I0318 11:00:22.475779 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563854-8xm7m"] Mar 18 11:00:23 crc kubenswrapper[4733]: I0318 11:00:23.190923 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2194f8e-8219-4d20-9657-bab035e9ce0b" path="/var/lib/kubelet/pods/e2194f8e-8219-4d20-9657-bab035e9ce0b/volumes" Mar 18 11:00:24 crc kubenswrapper[4733]: I0318 11:00:24.054791 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r62h8" event={"ID":"c9b1aa26-b953-4a50-a4b6-0edc261c5036","Type":"ContainerStarted","Data":"6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8"} Mar 18 11:00:24 crc kubenswrapper[4733]: I0318 11:00:24.085924 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r62h8" podStartSLOduration=2.927193043 podStartE2EDuration="6.085905467s" podCreationTimestamp="2026-03-18 11:00:18 +0000 UTC" firstStartedPulling="2026-03-18 11:00:20.005540102 +0000 UTC m=+2859.497274427" lastFinishedPulling="2026-03-18 11:00:23.164252536 +0000 UTC m=+2862.655986851" observedRunningTime="2026-03-18 
11:00:24.076162782 +0000 UTC m=+2863.567897187" watchObservedRunningTime="2026-03-18 11:00:24.085905467 +0000 UTC m=+2863.577639802" Mar 18 11:00:24 crc kubenswrapper[4733]: I0318 11:00:24.177596 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:00:24 crc kubenswrapper[4733]: E0318 11:00:24.177835 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:00:29 crc kubenswrapper[4733]: I0318 11:00:29.222674 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r62h8" Mar 18 11:00:29 crc kubenswrapper[4733]: I0318 11:00:29.223350 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r62h8" Mar 18 11:00:30 crc kubenswrapper[4733]: I0318 11:00:30.280691 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r62h8" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="registry-server" probeResult="failure" output=< Mar 18 11:00:30 crc kubenswrapper[4733]: timeout: failed to connect service ":50051" within 1s Mar 18 11:00:30 crc kubenswrapper[4733]: > Mar 18 11:00:33 crc kubenswrapper[4733]: I0318 11:00:33.175984 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:00:33 crc kubenswrapper[4733]: E0318 11:00:33.176747 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:00:37 crc kubenswrapper[4733]: I0318 11:00:37.175379 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:00:37 crc kubenswrapper[4733]: E0318 11:00:37.177570 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:00:39 crc kubenswrapper[4733]: I0318 11:00:39.293454 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r62h8" Mar 18 11:00:39 crc kubenswrapper[4733]: I0318 11:00:39.364282 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r62h8" Mar 18 11:00:39 crc kubenswrapper[4733]: I0318 11:00:39.557252 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r62h8"] Mar 18 11:00:41 crc kubenswrapper[4733]: I0318 11:00:41.213066 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r62h8" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="registry-server" containerID="cri-o://6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8" gracePeriod=2 Mar 18 11:00:41 crc kubenswrapper[4733]: I0318 11:00:41.801764 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r62h8" Mar 18 11:00:41 crc kubenswrapper[4733]: I0318 11:00:41.920803 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-utilities\") pod \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " Mar 18 11:00:41 crc kubenswrapper[4733]: I0318 11:00:41.920853 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-catalog-content\") pod \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " Mar 18 11:00:41 crc kubenswrapper[4733]: I0318 11:00:41.920900 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfptj\" (UniqueName: \"kubernetes.io/projected/c9b1aa26-b953-4a50-a4b6-0edc261c5036-kube-api-access-qfptj\") pod \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\" (UID: \"c9b1aa26-b953-4a50-a4b6-0edc261c5036\") " Mar 18 11:00:41 crc kubenswrapper[4733]: I0318 11:00:41.922522 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-utilities" (OuterVolumeSpecName: "utilities") pod "c9b1aa26-b953-4a50-a4b6-0edc261c5036" (UID: "c9b1aa26-b953-4a50-a4b6-0edc261c5036"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:00:41 crc kubenswrapper[4733]: I0318 11:00:41.932685 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b1aa26-b953-4a50-a4b6-0edc261c5036-kube-api-access-qfptj" (OuterVolumeSpecName: "kube-api-access-qfptj") pod "c9b1aa26-b953-4a50-a4b6-0edc261c5036" (UID: "c9b1aa26-b953-4a50-a4b6-0edc261c5036"). InnerVolumeSpecName "kube-api-access-qfptj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.022705 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.022740 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfptj\" (UniqueName: \"kubernetes.io/projected/c9b1aa26-b953-4a50-a4b6-0edc261c5036-kube-api-access-qfptj\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.099099 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9b1aa26-b953-4a50-a4b6-0edc261c5036" (UID: "c9b1aa26-b953-4a50-a4b6-0edc261c5036"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.124479 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b1aa26-b953-4a50-a4b6-0edc261c5036-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.226828 4733 generic.go:334] "Generic (PLEG): container finished" podID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerID="6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8" exitCode=0 Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.226877 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r62h8" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.226878 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r62h8" event={"ID":"c9b1aa26-b953-4a50-a4b6-0edc261c5036","Type":"ContainerDied","Data":"6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8"} Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.227011 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r62h8" event={"ID":"c9b1aa26-b953-4a50-a4b6-0edc261c5036","Type":"ContainerDied","Data":"04995b1b3f36eef24f7f078917a3d8bd2c408f0c02eb5dd2635fe0b5f1c102f9"} Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.227045 4733 scope.go:117] "RemoveContainer" containerID="6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.261922 4733 scope.go:117] "RemoveContainer" containerID="104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.282526 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r62h8"] Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.289562 4733 scope.go:117] "RemoveContainer" containerID="f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.292815 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r62h8"] Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.337821 4733 scope.go:117] "RemoveContainer" containerID="6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8" Mar 18 11:00:42 crc kubenswrapper[4733]: E0318 11:00:42.338740 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8\": container with ID starting with 6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8 not found: ID does not exist" containerID="6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.338781 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8"} err="failed to get container status \"6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8\": rpc error: code = NotFound desc = could not find container \"6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8\": container with ID starting with 6d6befd46f4eb77f197da2b1ba26e4921971717d431e08ee39f2e013f94e03b8 not found: ID does not exist" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.338805 4733 scope.go:117] "RemoveContainer" containerID="104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904" Mar 18 11:00:42 crc kubenswrapper[4733]: E0318 11:00:42.339205 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904\": container with ID starting with 104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904 not found: ID does not exist" containerID="104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.339308 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904"} err="failed to get container status \"104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904\": rpc error: code = NotFound desc = could not find container \"104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904\": container with ID 
starting with 104ec438bd7e1a0d9263648df2d07431c5cfac4181c73f8b04e13a665edfe904 not found: ID does not exist" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.339411 4733 scope.go:117] "RemoveContainer" containerID="f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317" Mar 18 11:00:42 crc kubenswrapper[4733]: E0318 11:00:42.339781 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317\": container with ID starting with f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317 not found: ID does not exist" containerID="f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317" Mar 18 11:00:42 crc kubenswrapper[4733]: I0318 11:00:42.339803 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317"} err="failed to get container status \"f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317\": rpc error: code = NotFound desc = could not find container \"f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317\": container with ID starting with f0919ae919a0dce1024c8f5a9092475badc3c61b34605b876a869ff90201c317 not found: ID does not exist" Mar 18 11:00:43 crc kubenswrapper[4733]: I0318 11:00:43.198046 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" path="/var/lib/kubelet/pods/c9b1aa26-b953-4a50-a4b6-0edc261c5036/volumes" Mar 18 11:00:48 crc kubenswrapper[4733]: I0318 11:00:48.175828 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:00:48 crc kubenswrapper[4733]: E0318 11:00:48.177847 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:00:49 crc kubenswrapper[4733]: I0318 11:00:49.175707 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:00:49 crc kubenswrapper[4733]: E0318 11:00:49.176773 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:00:59 crc kubenswrapper[4733]: I0318 11:00:59.176016 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:00:59 crc kubenswrapper[4733]: E0318 11:00:59.176889 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:01:02 crc kubenswrapper[4733]: I0318 11:01:02.175675 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:01:02 crc kubenswrapper[4733]: E0318 11:01:02.176139 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:01:05 crc kubenswrapper[4733]: I0318 11:01:05.985336 
4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h4rcn"] Mar 18 11:01:05 crc kubenswrapper[4733]: E0318 11:01:05.985890 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="extract-utilities" Mar 18 11:01:05 crc kubenswrapper[4733]: I0318 11:01:05.985902 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="extract-utilities" Mar 18 11:01:05 crc kubenswrapper[4733]: E0318 11:01:05.985918 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="registry-server" Mar 18 11:01:05 crc kubenswrapper[4733]: I0318 11:01:05.985924 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="registry-server" Mar 18 11:01:05 crc kubenswrapper[4733]: E0318 11:01:05.985934 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="extract-content" Mar 18 11:01:05 crc kubenswrapper[4733]: I0318 11:01:05.985942 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="extract-content" Mar 18 11:01:05 crc kubenswrapper[4733]: E0318 11:01:05.985961 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e41c70a-6d8d-47a8-9caf-57f46a60f96a" containerName="oc" Mar 18 11:01:05 crc kubenswrapper[4733]: I0318 11:01:05.985966 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e41c70a-6d8d-47a8-9caf-57f46a60f96a" containerName="oc" Mar 18 11:01:05 crc kubenswrapper[4733]: I0318 11:01:05.986119 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b1aa26-b953-4a50-a4b6-0edc261c5036" containerName="registry-server" Mar 18 11:01:05 crc kubenswrapper[4733]: I0318 11:01:05.986133 4733 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6e41c70a-6d8d-47a8-9caf-57f46a60f96a" containerName="oc" Mar 18 11:01:05 crc kubenswrapper[4733]: I0318 11:01:05.987212 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.000283 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4rcn"] Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.063839 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5vxg\" (UniqueName: \"kubernetes.io/projected/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-kube-api-access-b5vxg\") pod \"redhat-marketplace-h4rcn\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.063907 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-catalog-content\") pod \"redhat-marketplace-h4rcn\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.063972 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-utilities\") pod \"redhat-marketplace-h4rcn\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.166041 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5vxg\" (UniqueName: \"kubernetes.io/projected/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-kube-api-access-b5vxg\") pod \"redhat-marketplace-h4rcn\" (UID: 
\"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.166119 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-catalog-content\") pod \"redhat-marketplace-h4rcn\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.166207 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-utilities\") pod \"redhat-marketplace-h4rcn\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.166867 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-catalog-content\") pod \"redhat-marketplace-h4rcn\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.166928 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-utilities\") pod \"redhat-marketplace-h4rcn\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.190252 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5vxg\" (UniqueName: \"kubernetes.io/projected/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-kube-api-access-b5vxg\") pod \"redhat-marketplace-h4rcn\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " 
pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.312260 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:06 crc kubenswrapper[4733]: I0318 11:01:06.764159 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4rcn"] Mar 18 11:01:07 crc kubenswrapper[4733]: I0318 11:01:07.468033 4733 generic.go:334] "Generic (PLEG): container finished" podID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerID="08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906" exitCode=0 Mar 18 11:01:07 crc kubenswrapper[4733]: I0318 11:01:07.468459 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4rcn" event={"ID":"ea07c343-7a05-49fc-b5d6-4cbeda6a5381","Type":"ContainerDied","Data":"08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906"} Mar 18 11:01:07 crc kubenswrapper[4733]: I0318 11:01:07.468517 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4rcn" event={"ID":"ea07c343-7a05-49fc-b5d6-4cbeda6a5381","Type":"ContainerStarted","Data":"4f2a66f9800ceda513c83b3bfc142b08aff0f8d480daf6fc4b3befe4af03ca24"} Mar 18 11:01:07 crc kubenswrapper[4733]: I0318 11:01:07.472246 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 11:01:08 crc kubenswrapper[4733]: I0318 11:01:08.481119 4733 generic.go:334] "Generic (PLEG): container finished" podID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerID="72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58" exitCode=0 Mar 18 11:01:08 crc kubenswrapper[4733]: I0318 11:01:08.481224 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4rcn" 
event={"ID":"ea07c343-7a05-49fc-b5d6-4cbeda6a5381","Type":"ContainerDied","Data":"72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58"} Mar 18 11:01:09 crc kubenswrapper[4733]: I0318 11:01:09.490348 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4rcn" event={"ID":"ea07c343-7a05-49fc-b5d6-4cbeda6a5381","Type":"ContainerStarted","Data":"96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb"} Mar 18 11:01:09 crc kubenswrapper[4733]: I0318 11:01:09.524897 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h4rcn" podStartSLOduration=2.912135835 podStartE2EDuration="4.524878218s" podCreationTimestamp="2026-03-18 11:01:05 +0000 UTC" firstStartedPulling="2026-03-18 11:01:07.471980297 +0000 UTC m=+2906.963714622" lastFinishedPulling="2026-03-18 11:01:09.08472269 +0000 UTC m=+2908.576457005" observedRunningTime="2026-03-18 11:01:09.517630644 +0000 UTC m=+2909.009364979" watchObservedRunningTime="2026-03-18 11:01:09.524878218 +0000 UTC m=+2909.016612533" Mar 18 11:01:11 crc kubenswrapper[4733]: I0318 11:01:11.446721 4733 scope.go:117] "RemoveContainer" containerID="0646c2eb1d4076069ba17429b100767c9ea92208b7525c26c0789773916b849f" Mar 18 11:01:13 crc kubenswrapper[4733]: I0318 11:01:13.186361 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:01:13 crc kubenswrapper[4733]: E0318 11:01:13.186956 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:01:16 crc kubenswrapper[4733]: I0318 11:01:16.176715 4733 scope.go:117] "RemoveContainer" 
containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:01:16 crc kubenswrapper[4733]: E0318 11:01:16.177500 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:01:16 crc kubenswrapper[4733]: I0318 11:01:16.313127 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:16 crc kubenswrapper[4733]: I0318 11:01:16.313236 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:16 crc kubenswrapper[4733]: I0318 11:01:16.356416 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:16 crc kubenswrapper[4733]: I0318 11:01:16.616393 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:16 crc kubenswrapper[4733]: I0318 11:01:16.671851 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4rcn"] Mar 18 11:01:18 crc kubenswrapper[4733]: I0318 11:01:18.576948 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h4rcn" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerName="registry-server" containerID="cri-o://96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb" gracePeriod=2 Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.132655 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.195829 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-utilities\") pod \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.195945 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5vxg\" (UniqueName: \"kubernetes.io/projected/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-kube-api-access-b5vxg\") pod \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.196031 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-catalog-content\") pod \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\" (UID: \"ea07c343-7a05-49fc-b5d6-4cbeda6a5381\") " Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.196877 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-utilities" (OuterVolumeSpecName: "utilities") pod "ea07c343-7a05-49fc-b5d6-4cbeda6a5381" (UID: "ea07c343-7a05-49fc-b5d6-4cbeda6a5381"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.205456 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-kube-api-access-b5vxg" (OuterVolumeSpecName: "kube-api-access-b5vxg") pod "ea07c343-7a05-49fc-b5d6-4cbeda6a5381" (UID: "ea07c343-7a05-49fc-b5d6-4cbeda6a5381"). InnerVolumeSpecName "kube-api-access-b5vxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.220682 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea07c343-7a05-49fc-b5d6-4cbeda6a5381" (UID: "ea07c343-7a05-49fc-b5d6-4cbeda6a5381"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.298672 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.298770 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5vxg\" (UniqueName: \"kubernetes.io/projected/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-kube-api-access-b5vxg\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.298833 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea07c343-7a05-49fc-b5d6-4cbeda6a5381-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.592748 4733 generic.go:334] "Generic (PLEG): container finished" podID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerID="96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb" exitCode=0 Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.592804 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4rcn" event={"ID":"ea07c343-7a05-49fc-b5d6-4cbeda6a5381","Type":"ContainerDied","Data":"96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb"} Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.592839 4733 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-h4rcn" event={"ID":"ea07c343-7a05-49fc-b5d6-4cbeda6a5381","Type":"ContainerDied","Data":"4f2a66f9800ceda513c83b3bfc142b08aff0f8d480daf6fc4b3befe4af03ca24"} Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.592841 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4rcn" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.592864 4733 scope.go:117] "RemoveContainer" containerID="96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.628412 4733 scope.go:117] "RemoveContainer" containerID="72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.655176 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4rcn"] Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.670170 4733 scope.go:117] "RemoveContainer" containerID="08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.675149 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4rcn"] Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.710020 4733 scope.go:117] "RemoveContainer" containerID="96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb" Mar 18 11:01:19 crc kubenswrapper[4733]: E0318 11:01:19.710984 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb\": container with ID starting with 96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb not found: ID does not exist" containerID="96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.711039 4733 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb"} err="failed to get container status \"96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb\": rpc error: code = NotFound desc = could not find container \"96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb\": container with ID starting with 96095daa5aad339dd34e460bfe35bb63755effddfba9c220193ced729dacf8bb not found: ID does not exist" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.711072 4733 scope.go:117] "RemoveContainer" containerID="72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58" Mar 18 11:01:19 crc kubenswrapper[4733]: E0318 11:01:19.711866 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58\": container with ID starting with 72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58 not found: ID does not exist" containerID="72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.711918 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58"} err="failed to get container status \"72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58\": rpc error: code = NotFound desc = could not find container \"72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58\": container with ID starting with 72be83bbac810419f54dcde0d39f65ddb52ff733b3ac42abe2f4c4d0f624ae58 not found: ID does not exist" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.711948 4733 scope.go:117] "RemoveContainer" containerID="08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906" Mar 18 11:01:19 crc kubenswrapper[4733]: E0318 
11:01:19.712325 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906\": container with ID starting with 08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906 not found: ID does not exist" containerID="08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906" Mar 18 11:01:19 crc kubenswrapper[4733]: I0318 11:01:19.712376 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906"} err="failed to get container status \"08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906\": rpc error: code = NotFound desc = could not find container \"08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906\": container with ID starting with 08a3de0b8d7156f387872d738c867d86fe0dccd04c3114c5181b8c0ec179b906 not found: ID does not exist" Mar 18 11:01:21 crc kubenswrapper[4733]: I0318 11:01:21.195930 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" path="/var/lib/kubelet/pods/ea07c343-7a05-49fc-b5d6-4cbeda6a5381/volumes" Mar 18 11:01:26 crc kubenswrapper[4733]: I0318 11:01:26.176143 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:01:26 crc kubenswrapper[4733]: E0318 11:01:26.176899 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:01:28 crc kubenswrapper[4733]: I0318 11:01:28.176137 4733 scope.go:117] "RemoveContainer" 
containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:01:28 crc kubenswrapper[4733]: E0318 11:01:28.177041 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:01:41 crc kubenswrapper[4733]: I0318 11:01:41.185579 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:01:41 crc kubenswrapper[4733]: E0318 11:01:41.186718 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:01:42 crc kubenswrapper[4733]: I0318 11:01:42.175902 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:01:42 crc kubenswrapper[4733]: E0318 11:01:42.176265 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.571034 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.571475 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.699951 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m754c/must-gather-dtbq6"] Mar 18 11:01:43 crc kubenswrapper[4733]: E0318 11:01:43.700469 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerName="extract-content" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.700498 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerName="extract-content" Mar 18 11:01:43 crc kubenswrapper[4733]: E0318 11:01:43.700524 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerName="extract-utilities" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.700540 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerName="extract-utilities" Mar 18 11:01:43 crc kubenswrapper[4733]: E0318 11:01:43.700557 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerName="registry-server" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.700568 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerName="registry-server" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.700842 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea07c343-7a05-49fc-b5d6-4cbeda6a5381" containerName="registry-server" Mar 18 11:01:43 crc kubenswrapper[4733]: 
I0318 11:01:43.702234 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.704604 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m754c"/"openshift-service-ca.crt" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.707387 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m754c"/"kube-root-ca.crt" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.720875 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m754c/must-gather-dtbq6"] Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.838523 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5542a33f-3466-419c-af8f-3391bcc3d241-must-gather-output\") pod \"must-gather-dtbq6\" (UID: \"5542a33f-3466-419c-af8f-3391bcc3d241\") " pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.838730 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57p52\" (UniqueName: \"kubernetes.io/projected/5542a33f-3466-419c-af8f-3391bcc3d241-kube-api-access-57p52\") pod \"must-gather-dtbq6\" (UID: \"5542a33f-3466-419c-af8f-3391bcc3d241\") " pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.939843 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5542a33f-3466-419c-af8f-3391bcc3d241-must-gather-output\") pod \"must-gather-dtbq6\" (UID: \"5542a33f-3466-419c-af8f-3391bcc3d241\") " pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.940011 4733 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57p52\" (UniqueName: \"kubernetes.io/projected/5542a33f-3466-419c-af8f-3391bcc3d241-kube-api-access-57p52\") pod \"must-gather-dtbq6\" (UID: \"5542a33f-3466-419c-af8f-3391bcc3d241\") " pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.940298 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5542a33f-3466-419c-af8f-3391bcc3d241-must-gather-output\") pod \"must-gather-dtbq6\" (UID: \"5542a33f-3466-419c-af8f-3391bcc3d241\") " pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:01:43 crc kubenswrapper[4733]: I0318 11:01:43.958171 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57p52\" (UniqueName: \"kubernetes.io/projected/5542a33f-3466-419c-af8f-3391bcc3d241-kube-api-access-57p52\") pod \"must-gather-dtbq6\" (UID: \"5542a33f-3466-419c-af8f-3391bcc3d241\") " pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:01:44 crc kubenswrapper[4733]: I0318 11:01:44.022643 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:01:44 crc kubenswrapper[4733]: I0318 11:01:44.461964 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m754c/must-gather-dtbq6"] Mar 18 11:01:44 crc kubenswrapper[4733]: I0318 11:01:44.833451 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m754c/must-gather-dtbq6" event={"ID":"5542a33f-3466-419c-af8f-3391bcc3d241","Type":"ContainerStarted","Data":"20467b25239c2c45218415caa32f35ca4a630d25fd4fdaa101490d758d81bb69"} Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.728096 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m754c/crc-debug-xs7wl"] Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.729580 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/crc-debug-xs7wl" Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.733710 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-m754c"/"default-dockercfg-rlbjp" Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.824791 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9mx\" (UniqueName: \"kubernetes.io/projected/cd690373-efdd-4c07-89b0-d7283af1e3eb-kube-api-access-fz9mx\") pod \"crc-debug-xs7wl\" (UID: \"cd690373-efdd-4c07-89b0-d7283af1e3eb\") " pod="openshift-must-gather-m754c/crc-debug-xs7wl" Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.824891 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd690373-efdd-4c07-89b0-d7283af1e3eb-host\") pod \"crc-debug-xs7wl\" (UID: \"cd690373-efdd-4c07-89b0-d7283af1e3eb\") " pod="openshift-must-gather-m754c/crc-debug-xs7wl" Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.898372 4733 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-m754c/must-gather-dtbq6" event={"ID":"5542a33f-3466-419c-af8f-3391bcc3d241","Type":"ContainerStarted","Data":"1557b68a8a325a0ed8ba1781990482335f57ef77fd7aa971706340af195f6c7b"} Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.898421 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m754c/must-gather-dtbq6" event={"ID":"5542a33f-3466-419c-af8f-3391bcc3d241","Type":"ContainerStarted","Data":"83201b6bc07225e6541cd95837440f81a991cdbcc76e2856eb4af65abc081fb9"} Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.922599 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m754c/must-gather-dtbq6" podStartSLOduration=2.625467189 podStartE2EDuration="8.922582733s" podCreationTimestamp="2026-03-18 11:01:43 +0000 UTC" firstStartedPulling="2026-03-18 11:01:44.477491428 +0000 UTC m=+2943.969225793" lastFinishedPulling="2026-03-18 11:01:50.774607012 +0000 UTC m=+2950.266341337" observedRunningTime="2026-03-18 11:01:51.915800252 +0000 UTC m=+2951.407534587" watchObservedRunningTime="2026-03-18 11:01:51.922582733 +0000 UTC m=+2951.414317058" Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.929104 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd690373-efdd-4c07-89b0-d7283af1e3eb-host\") pod \"crc-debug-xs7wl\" (UID: \"cd690373-efdd-4c07-89b0-d7283af1e3eb\") " pod="openshift-must-gather-m754c/crc-debug-xs7wl" Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.929264 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9mx\" (UniqueName: \"kubernetes.io/projected/cd690373-efdd-4c07-89b0-d7283af1e3eb-kube-api-access-fz9mx\") pod \"crc-debug-xs7wl\" (UID: \"cd690373-efdd-4c07-89b0-d7283af1e3eb\") " pod="openshift-must-gather-m754c/crc-debug-xs7wl" Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 
11:01:51.929282 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd690373-efdd-4c07-89b0-d7283af1e3eb-host\") pod \"crc-debug-xs7wl\" (UID: \"cd690373-efdd-4c07-89b0-d7283af1e3eb\") " pod="openshift-must-gather-m754c/crc-debug-xs7wl" Mar 18 11:01:51 crc kubenswrapper[4733]: I0318 11:01:51.981505 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9mx\" (UniqueName: \"kubernetes.io/projected/cd690373-efdd-4c07-89b0-d7283af1e3eb-kube-api-access-fz9mx\") pod \"crc-debug-xs7wl\" (UID: \"cd690373-efdd-4c07-89b0-d7283af1e3eb\") " pod="openshift-must-gather-m754c/crc-debug-xs7wl" Mar 18 11:01:52 crc kubenswrapper[4733]: I0318 11:01:52.051172 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/crc-debug-xs7wl" Mar 18 11:01:52 crc kubenswrapper[4733]: I0318 11:01:52.910090 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m754c/crc-debug-xs7wl" event={"ID":"cd690373-efdd-4c07-89b0-d7283af1e3eb","Type":"ContainerStarted","Data":"c106f3e82925e7c333ae1706bfb1aa321da818a6e980a043c0f60386d545ad29"} Mar 18 11:01:53 crc kubenswrapper[4733]: I0318 11:01:53.183153 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:01:53 crc kubenswrapper[4733]: E0318 11:01:53.183390 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:01:54 crc kubenswrapper[4733]: I0318 11:01:54.175474 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:01:54 crc 
kubenswrapper[4733]: E0318 11:01:54.176062 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.139717 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563862-vmt2n"] Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.140705 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-vmt2n" Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.143563 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.143709 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.143804 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.151088 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-vmt2n"] Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.204328 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtn67\" (UniqueName: \"kubernetes.io/projected/ad06f0e3-f671-426a-b6d8-793e87745364-kube-api-access-rtn67\") pod \"auto-csr-approver-29563862-vmt2n\" (UID: \"ad06f0e3-f671-426a-b6d8-793e87745364\") " pod="openshift-infra/auto-csr-approver-29563862-vmt2n" Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.305865 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rtn67\" (UniqueName: \"kubernetes.io/projected/ad06f0e3-f671-426a-b6d8-793e87745364-kube-api-access-rtn67\") pod \"auto-csr-approver-29563862-vmt2n\" (UID: \"ad06f0e3-f671-426a-b6d8-793e87745364\") " pod="openshift-infra/auto-csr-approver-29563862-vmt2n" Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.335057 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtn67\" (UniqueName: \"kubernetes.io/projected/ad06f0e3-f671-426a-b6d8-793e87745364-kube-api-access-rtn67\") pod \"auto-csr-approver-29563862-vmt2n\" (UID: \"ad06f0e3-f671-426a-b6d8-793e87745364\") " pod="openshift-infra/auto-csr-approver-29563862-vmt2n" Mar 18 11:02:00 crc kubenswrapper[4733]: I0318 11:02:00.464095 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-vmt2n" Mar 18 11:02:06 crc kubenswrapper[4733]: W0318 11:02:06.155866 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad06f0e3_f671_426a_b6d8_793e87745364.slice/crio-217b84ae69a58013cd13fbbc295a76316974fb46cf597690074f4cafa94bc5c7 WatchSource:0}: Error finding container 217b84ae69a58013cd13fbbc295a76316974fb46cf597690074f4cafa94bc5c7: Status 404 returned error can't find the container with id 217b84ae69a58013cd13fbbc295a76316974fb46cf597690074f4cafa94bc5c7 Mar 18 11:02:06 crc kubenswrapper[4733]: I0318 11:02:06.156393 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-vmt2n"] Mar 18 11:02:07 crc kubenswrapper[4733]: I0318 11:02:07.053331 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m754c/crc-debug-xs7wl" event={"ID":"cd690373-efdd-4c07-89b0-d7283af1e3eb","Type":"ContainerStarted","Data":"d5fbd81f15e46718684cb2d4937c90c452a18fd504ebfaa9932047a8f5e25c6a"} Mar 18 11:02:07 crc 
kubenswrapper[4733]: I0318 11:02:07.055054 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-vmt2n" event={"ID":"ad06f0e3-f671-426a-b6d8-793e87745364","Type":"ContainerStarted","Data":"217b84ae69a58013cd13fbbc295a76316974fb46cf597690074f4cafa94bc5c7"} Mar 18 11:02:07 crc kubenswrapper[4733]: I0318 11:02:07.178886 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:02:07 crc kubenswrapper[4733]: E0318 11:02:07.179342 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:02:07 crc kubenswrapper[4733]: I0318 11:02:07.179473 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:02:07 crc kubenswrapper[4733]: E0318 11:02:07.179865 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:02:08 crc kubenswrapper[4733]: I0318 11:02:08.064752 4733 generic.go:334] "Generic (PLEG): container finished" podID="ad06f0e3-f671-426a-b6d8-793e87745364" containerID="8d69124510c80bc1e6643920b76340f2795eccc769fbdff3665f6edcc2793fe5" exitCode=0 Mar 18 11:02:08 crc kubenswrapper[4733]: I0318 11:02:08.064822 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-vmt2n" 
event={"ID":"ad06f0e3-f671-426a-b6d8-793e87745364","Type":"ContainerDied","Data":"8d69124510c80bc1e6643920b76340f2795eccc769fbdff3665f6edcc2793fe5"}
Mar 18 11:02:08 crc kubenswrapper[4733]: I0318 11:02:08.088598 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m754c/crc-debug-xs7wl" podStartSLOduration=3.2594349080000002 podStartE2EDuration="17.088572607s" podCreationTimestamp="2026-03-18 11:01:51 +0000 UTC" firstStartedPulling="2026-03-18 11:01:52.092250426 +0000 UTC m=+2951.583984751" lastFinishedPulling="2026-03-18 11:02:05.921388125 +0000 UTC m=+2965.413122450" observedRunningTime="2026-03-18 11:02:07.073998168 +0000 UTC m=+2966.565732513" watchObservedRunningTime="2026-03-18 11:02:08.088572607 +0000 UTC m=+2967.580306972"
Mar 18 11:02:09 crc kubenswrapper[4733]: I0318 11:02:09.406194 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-vmt2n"
Mar 18 11:02:09 crc kubenswrapper[4733]: I0318 11:02:09.553422 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtn67\" (UniqueName: \"kubernetes.io/projected/ad06f0e3-f671-426a-b6d8-793e87745364-kube-api-access-rtn67\") pod \"ad06f0e3-f671-426a-b6d8-793e87745364\" (UID: \"ad06f0e3-f671-426a-b6d8-793e87745364\") "
Mar 18 11:02:09 crc kubenswrapper[4733]: I0318 11:02:09.558791 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad06f0e3-f671-426a-b6d8-793e87745364-kube-api-access-rtn67" (OuterVolumeSpecName: "kube-api-access-rtn67") pod "ad06f0e3-f671-426a-b6d8-793e87745364" (UID: "ad06f0e3-f671-426a-b6d8-793e87745364"). InnerVolumeSpecName "kube-api-access-rtn67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:02:09 crc kubenswrapper[4733]: I0318 11:02:09.655666 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtn67\" (UniqueName: \"kubernetes.io/projected/ad06f0e3-f671-426a-b6d8-793e87745364-kube-api-access-rtn67\") on node \"crc\" DevicePath \"\""
Mar 18 11:02:10 crc kubenswrapper[4733]: I0318 11:02:10.081559 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563862-vmt2n" event={"ID":"ad06f0e3-f671-426a-b6d8-793e87745364","Type":"ContainerDied","Data":"217b84ae69a58013cd13fbbc295a76316974fb46cf597690074f4cafa94bc5c7"}
Mar 18 11:02:10 crc kubenswrapper[4733]: I0318 11:02:10.081604 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217b84ae69a58013cd13fbbc295a76316974fb46cf597690074f4cafa94bc5c7"
Mar 18 11:02:10 crc kubenswrapper[4733]: I0318 11:02:10.081609 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563862-vmt2n"
Mar 18 11:02:10 crc kubenswrapper[4733]: I0318 11:02:10.503927 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-qhf5l"]
Mar 18 11:02:10 crc kubenswrapper[4733]: I0318 11:02:10.510403 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563856-qhf5l"]
Mar 18 11:02:11 crc kubenswrapper[4733]: I0318 11:02:11.192265 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a4900e-15be-4e3f-a8a2-3eb582acbc20" path="/var/lib/kubelet/pods/e1a4900e-15be-4e3f-a8a2-3eb582acbc20/volumes"
Mar 18 11:02:11 crc kubenswrapper[4733]: I0318 11:02:11.544457 4733 scope.go:117] "RemoveContainer" containerID="9c09a3cb9db31583aa867b9b2e7873c25af33c3cb06cde66bc60959b1e039850"
Mar 18 11:02:13 crc kubenswrapper[4733]: I0318 11:02:13.570954 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 11:02:13 crc kubenswrapper[4733]: I0318 11:02:13.571258 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 11:02:20 crc kubenswrapper[4733]: I0318 11:02:20.176096 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 11:02:20 crc kubenswrapper[4733]: E0318 11:02:20.177253 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:02:21 crc kubenswrapper[4733]: I0318 11:02:21.190140 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 11:02:21 crc kubenswrapper[4733]: E0318 11:02:21.191037 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:02:25 crc kubenswrapper[4733]: I0318 11:02:25.632872 4733 generic.go:334] "Generic (PLEG): container finished" podID="cd690373-efdd-4c07-89b0-d7283af1e3eb" containerID="d5fbd81f15e46718684cb2d4937c90c452a18fd504ebfaa9932047a8f5e25c6a" exitCode=0
Mar 18 11:02:25 crc kubenswrapper[4733]: I0318 11:02:25.632965 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m754c/crc-debug-xs7wl" event={"ID":"cd690373-efdd-4c07-89b0-d7283af1e3eb","Type":"ContainerDied","Data":"d5fbd81f15e46718684cb2d4937c90c452a18fd504ebfaa9932047a8f5e25c6a"}
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.726381 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/crc-debug-xs7wl"
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.760239 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m754c/crc-debug-xs7wl"]
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.765982 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m754c/crc-debug-xs7wl"]
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.846250 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd690373-efdd-4c07-89b0-d7283af1e3eb-host\") pod \"cd690373-efdd-4c07-89b0-d7283af1e3eb\" (UID: \"cd690373-efdd-4c07-89b0-d7283af1e3eb\") "
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.846565 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz9mx\" (UniqueName: \"kubernetes.io/projected/cd690373-efdd-4c07-89b0-d7283af1e3eb-kube-api-access-fz9mx\") pod \"cd690373-efdd-4c07-89b0-d7283af1e3eb\" (UID: \"cd690373-efdd-4c07-89b0-d7283af1e3eb\") "
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.846446 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd690373-efdd-4c07-89b0-d7283af1e3eb-host" (OuterVolumeSpecName: "host") pod "cd690373-efdd-4c07-89b0-d7283af1e3eb" (UID: "cd690373-efdd-4c07-89b0-d7283af1e3eb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.846860 4733 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd690373-efdd-4c07-89b0-d7283af1e3eb-host\") on node \"crc\" DevicePath \"\""
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.856377 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd690373-efdd-4c07-89b0-d7283af1e3eb-kube-api-access-fz9mx" (OuterVolumeSpecName: "kube-api-access-fz9mx") pod "cd690373-efdd-4c07-89b0-d7283af1e3eb" (UID: "cd690373-efdd-4c07-89b0-d7283af1e3eb"). InnerVolumeSpecName "kube-api-access-fz9mx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:02:26 crc kubenswrapper[4733]: I0318 11:02:26.948475 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz9mx\" (UniqueName: \"kubernetes.io/projected/cd690373-efdd-4c07-89b0-d7283af1e3eb-kube-api-access-fz9mx\") on node \"crc\" DevicePath \"\""
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.185930 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd690373-efdd-4c07-89b0-d7283af1e3eb" path="/var/lib/kubelet/pods/cd690373-efdd-4c07-89b0-d7283af1e3eb/volumes"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.649453 4733 scope.go:117] "RemoveContainer" containerID="d5fbd81f15e46718684cb2d4937c90c452a18fd504ebfaa9932047a8f5e25c6a"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.649500 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/crc-debug-xs7wl"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.952821 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m754c/crc-debug-8v78b"]
Mar 18 11:02:27 crc kubenswrapper[4733]: E0318 11:02:27.953216 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd690373-efdd-4c07-89b0-d7283af1e3eb" containerName="container-00"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.953233 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd690373-efdd-4c07-89b0-d7283af1e3eb" containerName="container-00"
Mar 18 11:02:27 crc kubenswrapper[4733]: E0318 11:02:27.953278 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad06f0e3-f671-426a-b6d8-793e87745364" containerName="oc"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.953286 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad06f0e3-f671-426a-b6d8-793e87745364" containerName="oc"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.953459 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad06f0e3-f671-426a-b6d8-793e87745364" containerName="oc"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.953473 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd690373-efdd-4c07-89b0-d7283af1e3eb" containerName="container-00"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.954056 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.956656 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-m754c"/"default-dockercfg-rlbjp"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.964034 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gstb\" (UniqueName: \"kubernetes.io/projected/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-kube-api-access-6gstb\") pod \"crc-debug-8v78b\" (UID: \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\") " pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:27 crc kubenswrapper[4733]: I0318 11:02:27.964107 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-host\") pod \"crc-debug-8v78b\" (UID: \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\") " pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.065682 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gstb\" (UniqueName: \"kubernetes.io/projected/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-kube-api-access-6gstb\") pod \"crc-debug-8v78b\" (UID: \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\") " pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.066004 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-host\") pod \"crc-debug-8v78b\" (UID: \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\") " pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.066120 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-host\") pod \"crc-debug-8v78b\" (UID: \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\") " pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.090188 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gstb\" (UniqueName: \"kubernetes.io/projected/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-kube-api-access-6gstb\") pod \"crc-debug-8v78b\" (UID: \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\") " pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.278421 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.658695 4733 generic.go:334] "Generic (PLEG): container finished" podID="60216bd7-dbfa-4ef9-a60d-0941dd33ff44" containerID="de436dca9aa9677c82c7c01e372254e32bace5a1643be11aa46c20d0852dd35c" exitCode=1
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.658807 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m754c/crc-debug-8v78b" event={"ID":"60216bd7-dbfa-4ef9-a60d-0941dd33ff44","Type":"ContainerDied","Data":"de436dca9aa9677c82c7c01e372254e32bace5a1643be11aa46c20d0852dd35c"}
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.659021 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m754c/crc-debug-8v78b" event={"ID":"60216bd7-dbfa-4ef9-a60d-0941dd33ff44","Type":"ContainerStarted","Data":"696ddf380a049e8177e4e3e6b95841c6aafcaa12c024e09fdf4c74f937470a0e"}
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.692211 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m754c/crc-debug-8v78b"]
Mar 18 11:02:28 crc kubenswrapper[4733]: I0318 11:02:28.703164 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m754c/crc-debug-8v78b"]
Mar 18 11:02:29 crc kubenswrapper[4733]: I0318 11:02:29.747534 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:29 crc kubenswrapper[4733]: I0318 11:02:29.896112 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gstb\" (UniqueName: \"kubernetes.io/projected/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-kube-api-access-6gstb\") pod \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\" (UID: \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\") "
Mar 18 11:02:29 crc kubenswrapper[4733]: I0318 11:02:29.896537 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-host\") pod \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\" (UID: \"60216bd7-dbfa-4ef9-a60d-0941dd33ff44\") "
Mar 18 11:02:29 crc kubenswrapper[4733]: I0318 11:02:29.896663 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-host" (OuterVolumeSpecName: "host") pod "60216bd7-dbfa-4ef9-a60d-0941dd33ff44" (UID: "60216bd7-dbfa-4ef9-a60d-0941dd33ff44"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 11:02:29 crc kubenswrapper[4733]: I0318 11:02:29.897270 4733 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-host\") on node \"crc\" DevicePath \"\""
Mar 18 11:02:29 crc kubenswrapper[4733]: I0318 11:02:29.901888 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-kube-api-access-6gstb" (OuterVolumeSpecName: "kube-api-access-6gstb") pod "60216bd7-dbfa-4ef9-a60d-0941dd33ff44" (UID: "60216bd7-dbfa-4ef9-a60d-0941dd33ff44"). InnerVolumeSpecName "kube-api-access-6gstb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:02:29 crc kubenswrapper[4733]: I0318 11:02:29.998254 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gstb\" (UniqueName: \"kubernetes.io/projected/60216bd7-dbfa-4ef9-a60d-0941dd33ff44-kube-api-access-6gstb\") on node \"crc\" DevicePath \"\""
Mar 18 11:02:30 crc kubenswrapper[4733]: I0318 11:02:30.677206 4733 scope.go:117] "RemoveContainer" containerID="de436dca9aa9677c82c7c01e372254e32bace5a1643be11aa46c20d0852dd35c"
Mar 18 11:02:30 crc kubenswrapper[4733]: I0318 11:02:30.677969 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m754c/crc-debug-8v78b"
Mar 18 11:02:31 crc kubenswrapper[4733]: I0318 11:02:31.222461 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60216bd7-dbfa-4ef9-a60d-0941dd33ff44" path="/var/lib/kubelet/pods/60216bd7-dbfa-4ef9-a60d-0941dd33ff44/volumes"
Mar 18 11:02:35 crc kubenswrapper[4733]: I0318 11:02:35.176120 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 11:02:35 crc kubenswrapper[4733]: E0318 11:02:35.178314 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:02:36 crc kubenswrapper[4733]: I0318 11:02:36.175625 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 11:02:36 crc kubenswrapper[4733]: E0318 11:02:36.176669 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.571105 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.571769 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.571833 4733 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp"
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.572759 4733 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"} pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.572848 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" containerID="cri-o://6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" gracePeriod=600
Mar 18 11:02:43 crc kubenswrapper[4733]: E0318 11:02:43.722698 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.777919 4733 generic.go:334] "Generic (PLEG): container finished" podID="6f75e1c5-e0c5-43df-944f-77b734070793" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" exitCode=0
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.777977 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerDied","Data":"6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"}
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.778011 4733 scope.go:117] "RemoveContainer" containerID="32198f7b4110f4b23718a4e872dd512bdbf76e8166cae4cab128ee6761e36a56"
Mar 18 11:02:43 crc kubenswrapper[4733]: I0318 11:02:43.778576 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:02:43 crc kubenswrapper[4733]: E0318 11:02:43.778893 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.232713 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f59b8f679-z8m4g_5fcd9264-61af-4872-82e6-8b0e1667ac70/init/0.log"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.436184 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f59b8f679-z8m4g_5fcd9264-61af-4872-82e6-8b0e1667ac70/dnsmasq-dns/0.log"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.486035 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f59b8f679-z8m4g_5fcd9264-61af-4872-82e6-8b0e1667ac70/init/0.log"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.541565 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_55f0b230-09f2-4be2-aa1f-76a37f3fe30c/kube-state-metrics/0.log"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.724236 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_dd66892e-808c-405a-ac8e-366b6ca8b148/memcached/0.log"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.741151 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0208d826-df0f-41c8-83a7-821a21b7b85d/mysql-bootstrap/0.log"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.916137 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0208d826-df0f-41c8-83a7-821a21b7b85d/galera/0.log"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.929752 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dc60b49b-96fa-40fd-a8e5-40c810f5ef80/mysql-bootstrap/0.log"
Mar 18 11:02:46 crc kubenswrapper[4733]: I0318 11:02:46.975463 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0208d826-df0f-41c8-83a7-821a21b7b85d/mysql-bootstrap/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.093667 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dc60b49b-96fa-40fd-a8e5-40c810f5ef80/mysql-bootstrap/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.098408 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dc60b49b-96fa-40fd-a8e5-40c810f5ef80/galera/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.145872 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6trms_e7849feb-5f1b-4b67-a3f7-8a419ebda0bd/openstack-network-exporter/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.283605 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ljrgt_d75a8d54-aca8-49cd-9062-6389baaf7a09/ovsdb-server-init/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.442577 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ljrgt_d75a8d54-aca8-49cd-9062-6389baaf7a09/ovsdb-server-init/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.453835 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ljrgt_d75a8d54-aca8-49cd-9062-6389baaf7a09/ovs-vswitchd/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.468138 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ljrgt_d75a8d54-aca8-49cd-9062-6389baaf7a09/ovsdb-server/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.640251 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rh64b_e3c842d3-b3dd-4cf2-9df0-16cea4061bc5/ovn-controller/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.641094 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96c7007d-b722-4518-a298-269808d7dfc5/openstack-network-exporter/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.673834 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_96c7007d-b722-4518-a298-269808d7dfc5/ovn-northd/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.796710 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a8c27598-870d-4de0-a986-47042d7d6f4c/openstack-network-exporter/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.823069 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a8c27598-870d-4de0-a986-47042d7d6f4c/ovsdbserver-nb/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.961767 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0868210e-9d93-4f63-b425-7db21f13cd90/openstack-network-exporter/0.log"
Mar 18 11:02:47 crc kubenswrapper[4733]: I0318 11:02:47.972573 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0868210e-9d93-4f63-b425-7db21f13cd90/ovsdbserver-sb/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.059624 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b4a4e3e2-bd4d-4f8d-97bc-51267378ab03/setup-container/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.179902 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b4a4e3e2-bd4d-4f8d-97bc-51267378ab03/setup-container/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.214657 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b4a4e3e2-bd4d-4f8d-97bc-51267378ab03/rabbitmq/10.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.269708 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b4a4e3e2-bd4d-4f8d-97bc-51267378ab03/rabbitmq/10.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.357135 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0570ce4-1455-4698-85cf-01f7108d9e7f/setup-container/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.525039 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0570ce4-1455-4698-85cf-01f7108d9e7f/setup-container/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.530629 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0570ce4-1455-4698-85cf-01f7108d9e7f/rabbitmq/10.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.537091 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0570ce4-1455-4698-85cf-01f7108d9e7f/rabbitmq/10.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.692878 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-nfmp2_5e3fc960-7783-4952-90c9-1551c780ae03/swift-ring-rebalance/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.767017 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/account-auditor/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.785864 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/account-reaper/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.892101 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/account-replicator/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.893572 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/account-server/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.931159 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/container-auditor/0.log"
Mar 18 11:02:48 crc kubenswrapper[4733]: I0318 11:02:48.992499 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/container-replicator/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.047893 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/container-updater/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.076537 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/container-server/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.106908 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/object-auditor/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.176168 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 11:02:49 crc kubenswrapper[4733]: E0318 11:02:49.176502 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.177078 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 11:02:49 crc kubenswrapper[4733]: E0318 11:02:49.177291 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.189921 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/object-expirer/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.207178 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/object-replicator/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.279627 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/object-server/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.306608 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/object-updater/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.388504 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/rsync/0.log"
Mar 18 11:02:49 crc kubenswrapper[4733]: I0318 11:02:49.410385 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4f94cfc9-67cc-474c-8d99-58a9d4e0273f/swift-recon-cron/0.log"
Mar 18 11:02:59 crc kubenswrapper[4733]: I0318 11:02:59.175792 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:02:59 crc kubenswrapper[4733]: E0318 11:02:59.176550 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:03:02 crc kubenswrapper[4733]: I0318 11:03:02.175802 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373"
Mar 18 11:03:02 crc kubenswrapper[4733]: E0318 11:03:02.176333 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:03:02 crc kubenswrapper[4733]: I0318 11:03:02.176366 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f"
Mar 18 11:03:02 crc kubenswrapper[4733]: E0318 11:03:02.176504 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:03:05 crc kubenswrapper[4733]: I0318 11:03:05.791831 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67_53c111d7-ea42-4913-b378-ec44062b0691/util/0.log"
Mar 18 11:03:05 crc kubenswrapper[4733]: I0318 11:03:05.968257 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67_53c111d7-ea42-4913-b378-ec44062b0691/util/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.012148 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67_53c111d7-ea42-4913-b378-ec44062b0691/pull/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.046648 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67_53c111d7-ea42-4913-b378-ec44062b0691/pull/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.152158 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67_53c111d7-ea42-4913-b378-ec44062b0691/util/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.168728 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67_53c111d7-ea42-4913-b378-ec44062b0691/pull/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.186203 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_77b7da525d1abc8152b798823d798d773ced7ad76161af6957e3c157386hj67_53c111d7-ea42-4913-b378-ec44062b0691/extract/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.342484 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-sfv8v_0fb2ba68-fa0f-4483-afdf-2eb381c54320/manager/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.510605 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-t8796_748f4855-3978-4ecd-805e-0fee34ce0094/manager/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.720613 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-ljvrt_bc0e28fc-cff0-4c39-8073-61d5d6481866/manager/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.815641 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-cxlns_bd5ae902-d036-4e52-983d-aa3e1a86dca8/manager/0.log"
Mar 18 11:03:06 crc kubenswrapper[4733]: I0318 11:03:06.957993 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-wkjtf_838f8a80-01c0-41d8-b431-2a23c9235fab/manager/0.log"
Mar 18 11:03:07 crc kubenswrapper[4733]: I0318 11:03:07.088894 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-v6zxn_8fe910c4-798b-4381-a71d-697459f7f79a/manager/0.log"
Mar 18 11:03:07 crc kubenswrapper[4733]: I0318 11:03:07.221749 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-74c694b97b-j4snz_651c7dd5-3adc-48b4-b579-309258aa3735/manager/0.log"
Mar 18 11:03:07 crc kubenswrapper[4733]: I0318 11:03:07.253509 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-pcscc_fd146b1e-59a9-4246-9520-f2d6f6cf6cd1/manager/0.log"
Mar 18 11:03:07 crc kubenswrapper[4733]: I0318 11:03:07.446674 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-chmbd_ae8a8fbc-d425-4da5-afb3-438a85a43722/manager/0.log"
Mar 18 11:03:07 crc kubenswrapper[4733]: I0318 11:03:07.459667 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-tp4s7_de7565f5-677b-4aeb-90ab-0d632b28b295/manager/0.log"
Mar 18 11:03:07 crc kubenswrapper[4733]: I0318 11:03:07.616026 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-4xzlc_79dfdcde-0538-4777-959e-1daf2b6263de/manager/0.log"
Mar 18 11:03:07 crc
kubenswrapper[4733]: I0318 11:03:07.653585 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-gkndg_216f9239-7d2e-483e-a89f-0955a518aa4a/manager/0.log" Mar 18 11:03:07 crc kubenswrapper[4733]: I0318 11:03:07.816457 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-jmwdk_f93025ae-ebc3-4aed-bfde-e514d8b814ce/manager/0.log" Mar 18 11:03:07 crc kubenswrapper[4733]: I0318 11:03:07.858485 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-22wt5_31999dbe-554e-4168-a902-1f62e82ce854/manager/0.log" Mar 18 11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.011410 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-s6rbv_6eca2f16-53b8-4173-ace4-18b7292b1369/manager/0.log" Mar 18 11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.142842 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-579f7bfb88-sfsb4_d1b10458-2335-4b46-9f63-c8a005096ff7/operator/0.log" Mar 18 11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.338955 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-g2m9r_45605961-e7c2-4bd3-a670-d8541124408a/registry-server/0.log" Mar 18 11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.495275 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85877db48-qvlf2_a4b7e706-a9a7-490a-84a8-094d1d909ba8/manager/0.log" Mar 18 11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.512988 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-flv24_6762c515-b422-4157-a8ce-b9ca4781e134/manager/0.log" Mar 18 
11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.652066 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-9txbj_4ad2d88a-c733-4409-b07b-5ff4661e1b68/manager/0.log" Mar 18 11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.705088 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-k64ch_e64c7cd6-a04b-440e-ac47-40f672fbc333/operator/0.log" Mar 18 11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.858669 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-v2pb2_6ea742ac-3be9-4067-ab5a-032365494fde/manager/0.log" Mar 18 11:03:08 crc kubenswrapper[4733]: I0318 11:03:08.953874 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-fd4t7_759f85a1-4e24-4b61-879b-90801d648683/manager/0.log" Mar 18 11:03:09 crc kubenswrapper[4733]: I0318 11:03:09.031588 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-nskpj_6152e0d7-6362-4c7d-ba2b-4a1e55ca4f54/manager/0.log" Mar 18 11:03:09 crc kubenswrapper[4733]: I0318 11:03:09.145462 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-sqr4g_cd9234ed-fcbc-4d81-9034-27d39b3df6ee/manager/0.log" Mar 18 11:03:11 crc kubenswrapper[4733]: I0318 11:03:11.180523 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:03:11 crc kubenswrapper[4733]: E0318 11:03:11.181157 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:03:15 crc kubenswrapper[4733]: I0318 11:03:15.176132 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:03:15 crc kubenswrapper[4733]: E0318 11:03:15.176694 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:03:17 crc kubenswrapper[4733]: I0318 11:03:17.180915 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:03:17 crc kubenswrapper[4733]: E0318 11:03:17.181125 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:03:26 crc kubenswrapper[4733]: I0318 11:03:26.176964 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:03:26 crc kubenswrapper[4733]: E0318 11:03:26.178292 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:03:28 crc kubenswrapper[4733]: I0318 11:03:28.176267 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:03:28 crc kubenswrapper[4733]: E0318 11:03:28.176694 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:03:28 crc kubenswrapper[4733]: I0318 11:03:28.177239 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:03:28 crc kubenswrapper[4733]: E0318 11:03:28.177422 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:03:28 crc kubenswrapper[4733]: I0318 11:03:28.458622 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4lbr5_f2b6c2ec-c07f-4d59-ba90-1ed2ec55d8a7/control-plane-machine-set-operator/0.log" Mar 18 11:03:28 crc kubenswrapper[4733]: I0318 11:03:28.631944 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nbftd_0c02459c-3d75-4363-a010-3e9639bb9b4e/kube-rbac-proxy/0.log" Mar 18 11:03:28 crc kubenswrapper[4733]: I0318 11:03:28.647209 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nbftd_0c02459c-3d75-4363-a010-3e9639bb9b4e/machine-api-operator/0.log" Mar 18 11:03:37 crc kubenswrapper[4733]: I0318 11:03:37.175467 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:03:37 crc kubenswrapper[4733]: E0318 11:03:37.176689 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:03:39 crc kubenswrapper[4733]: I0318 11:03:39.179362 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:03:39 crc kubenswrapper[4733]: E0318 11:03:39.187617 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:03:40 crc kubenswrapper[4733]: I0318 11:03:40.176219 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:03:40 crc kubenswrapper[4733]: E0318 11:03:40.176403 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 
11:03:40 crc kubenswrapper[4733]: I0318 11:03:40.964946 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-rd2dh_ce77d29d-b82e-46be-a694-b6eea5da9379/cert-manager-controller/0.log" Mar 18 11:03:41 crc kubenswrapper[4733]: I0318 11:03:41.155740 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8ds68_585c06be-86bd-48b7-954e-9aec01b08874/cert-manager-cainjector/0.log" Mar 18 11:03:41 crc kubenswrapper[4733]: I0318 11:03:41.218884 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-2nr27_534b0ac6-c9b1-4940-9e6e-ed36de1ec1e8/cert-manager-webhook/0.log" Mar 18 11:03:51 crc kubenswrapper[4733]: I0318 11:03:51.181636 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:03:51 crc kubenswrapper[4733]: E0318 11:03:51.182283 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:03:52 crc kubenswrapper[4733]: I0318 11:03:52.176013 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:03:52 crc kubenswrapper[4733]: I0318 11:03:52.176506 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:03:52 crc kubenswrapper[4733]: E0318 11:03:52.176711 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:03:52 crc kubenswrapper[4733]: E0318 11:03:52.176939 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:03:53 crc kubenswrapper[4733]: I0318 11:03:53.480114 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-btpf9_95b678ac-c7be-4c57-8663-05b207f43338/nmstate-console-plugin/0.log" Mar 18 11:03:53 crc kubenswrapper[4733]: I0318 11:03:53.539826 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8jncr_4c5d76ae-c917-4ba7-91d7-332a8e578245/nmstate-handler/0.log" Mar 18 11:03:53 crc kubenswrapper[4733]: I0318 11:03:53.589891 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-7swn6_eb2e5225-c943-4b06-b2de-90ab1168242b/kube-rbac-proxy/0.log" Mar 18 11:03:53 crc kubenswrapper[4733]: I0318 11:03:53.667749 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-7swn6_eb2e5225-c943-4b06-b2de-90ab1168242b/nmstate-metrics/0.log" Mar 18 11:03:53 crc kubenswrapper[4733]: I0318 11:03:53.771463 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-2d4dp_7c8f098b-42c0-4132-88c0-350e0c872f9d/nmstate-operator/0.log" Mar 18 11:03:53 crc kubenswrapper[4733]: I0318 11:03:53.853935 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-m6rhx_29460af7-7801-4268-aae8-f84763762e2f/nmstate-webhook/0.log" Mar 
18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.169393 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563864-g5m6l"] Mar 18 11:04:00 crc kubenswrapper[4733]: E0318 11:04:00.172090 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60216bd7-dbfa-4ef9-a60d-0941dd33ff44" containerName="container-00" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.176106 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="60216bd7-dbfa-4ef9-a60d-0941dd33ff44" containerName="container-00" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.176514 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="60216bd7-dbfa-4ef9-a60d-0941dd33ff44" containerName="container-00" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.177284 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-g5m6l" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.179621 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.179740 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.180596 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-g5m6l"] Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.180911 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.221789 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpdk7\" (UniqueName: \"kubernetes.io/projected/720459c9-dd9a-4d0f-8541-f4f2f578acc5-kube-api-access-hpdk7\") pod \"auto-csr-approver-29563864-g5m6l\" (UID: 
\"720459c9-dd9a-4d0f-8541-f4f2f578acc5\") " pod="openshift-infra/auto-csr-approver-29563864-g5m6l" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.324102 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpdk7\" (UniqueName: \"kubernetes.io/projected/720459c9-dd9a-4d0f-8541-f4f2f578acc5-kube-api-access-hpdk7\") pod \"auto-csr-approver-29563864-g5m6l\" (UID: \"720459c9-dd9a-4d0f-8541-f4f2f578acc5\") " pod="openshift-infra/auto-csr-approver-29563864-g5m6l" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.351740 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpdk7\" (UniqueName: \"kubernetes.io/projected/720459c9-dd9a-4d0f-8541-f4f2f578acc5-kube-api-access-hpdk7\") pod \"auto-csr-approver-29563864-g5m6l\" (UID: \"720459c9-dd9a-4d0f-8541-f4f2f578acc5\") " pod="openshift-infra/auto-csr-approver-29563864-g5m6l" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.498157 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-g5m6l" Mar 18 11:04:00 crc kubenswrapper[4733]: I0318 11:04:00.969496 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-g5m6l"] Mar 18 11:04:01 crc kubenswrapper[4733]: I0318 11:04:01.520755 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-g5m6l" event={"ID":"720459c9-dd9a-4d0f-8541-f4f2f578acc5","Type":"ContainerStarted","Data":"f347f95424cf6cf3dae5086a5ebf375241fc5353eb08c3a8685b22e00d50d8be"} Mar 18 11:04:03 crc kubenswrapper[4733]: I0318 11:04:03.535086 4733 generic.go:334] "Generic (PLEG): container finished" podID="720459c9-dd9a-4d0f-8541-f4f2f578acc5" containerID="b3d994fcf267bc98b12bd59da2b08ea67ce03ac437e382c7728b7a3d005bd1f9" exitCode=0 Mar 18 11:04:03 crc kubenswrapper[4733]: I0318 11:04:03.535142 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-g5m6l" event={"ID":"720459c9-dd9a-4d0f-8541-f4f2f578acc5","Type":"ContainerDied","Data":"b3d994fcf267bc98b12bd59da2b08ea67ce03ac437e382c7728b7a3d005bd1f9"} Mar 18 11:04:04 crc kubenswrapper[4733]: I0318 11:04:04.175754 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:04:04 crc kubenswrapper[4733]: I0318 11:04:04.544788 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"} Mar 18 11:04:04 crc kubenswrapper[4733]: I0318 11:04:04.545446 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 11:04:04 crc kubenswrapper[4733]: I0318 11:04:04.867738 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-g5m6l" Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.021821 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpdk7\" (UniqueName: \"kubernetes.io/projected/720459c9-dd9a-4d0f-8541-f4f2f578acc5-kube-api-access-hpdk7\") pod \"720459c9-dd9a-4d0f-8541-f4f2f578acc5\" (UID: \"720459c9-dd9a-4d0f-8541-f4f2f578acc5\") " Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.026356 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720459c9-dd9a-4d0f-8541-f4f2f578acc5-kube-api-access-hpdk7" (OuterVolumeSpecName: "kube-api-access-hpdk7") pod "720459c9-dd9a-4d0f-8541-f4f2f578acc5" (UID: "720459c9-dd9a-4d0f-8541-f4f2f578acc5"). InnerVolumeSpecName "kube-api-access-hpdk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.123811 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpdk7\" (UniqueName: \"kubernetes.io/projected/720459c9-dd9a-4d0f-8541-f4f2f578acc5-kube-api-access-hpdk7\") on node \"crc\" DevicePath \"\"" Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.178747 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:04:05 crc kubenswrapper[4733]: E0318 11:04:05.179103 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.552825 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563864-g5m6l" Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.552841 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563864-g5m6l" event={"ID":"720459c9-dd9a-4d0f-8541-f4f2f578acc5","Type":"ContainerDied","Data":"f347f95424cf6cf3dae5086a5ebf375241fc5353eb08c3a8685b22e00d50d8be"} Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.552885 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f347f95424cf6cf3dae5086a5ebf375241fc5353eb08c3a8685b22e00d50d8be" Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.946366 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-bxq6r"] Mar 18 11:04:05 crc kubenswrapper[4733]: I0318 11:04:05.952168 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563858-bxq6r"] Mar 18 11:04:07 crc kubenswrapper[4733]: I0318 11:04:07.176203 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:04:07 crc kubenswrapper[4733]: I0318 11:04:07.191689 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc61f2d-1837-4253-a3a3-91d8acc950f8" path="/var/lib/kubelet/pods/7bc61f2d-1837-4253-a3a3-91d8acc950f8/volumes" Mar 18 11:04:07 crc kubenswrapper[4733]: I0318 11:04:07.568561 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"} Mar 18 11:04:07 crc kubenswrapper[4733]: I0318 11:04:07.569048 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 11:04:09 crc kubenswrapper[4733]: I0318 11:04:09.589615 4733 generic.go:334] "Generic (PLEG): container finished" 
podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" exitCode=0 Mar 18 11:04:09 crc kubenswrapper[4733]: I0318 11:04:09.589656 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"} Mar 18 11:04:09 crc kubenswrapper[4733]: I0318 11:04:09.589694 4733 scope.go:117] "RemoveContainer" containerID="42f5e854566e19360cf16fa02f3e09efbcbafeba0ef62811eb321face5cd1f9f" Mar 18 11:04:09 crc kubenswrapper[4733]: I0318 11:04:09.590310 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:04:09 crc kubenswrapper[4733]: E0318 11:04:09.590558 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:04:12 crc kubenswrapper[4733]: I0318 11:04:12.616766 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" exitCode=0 Mar 18 11:04:12 crc kubenswrapper[4733]: I0318 11:04:12.616860 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"} Mar 18 11:04:12 crc kubenswrapper[4733]: I0318 11:04:12.617360 4733 scope.go:117] "RemoveContainer" containerID="bd29705735db3b754afb9922232f3ff6fa404f8d375f4f93c086696cc3583373" Mar 18 11:04:12 crc kubenswrapper[4733]: I0318 11:04:12.617990 
4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:04:12 crc kubenswrapper[4733]: E0318 11:04:12.618321 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:04:16 crc kubenswrapper[4733]: I0318 11:04:16.403890 4733 scope.go:117] "RemoveContainer" containerID="ae5d77ede52fa11bd913773d4add1f40cd6fcaf6154c4236eccd984879ea57ff" Mar 18 11:04:18 crc kubenswrapper[4733]: I0318 11:04:18.175428 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:04:18 crc kubenswrapper[4733]: E0318 11:04:18.176028 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:04:21 crc kubenswrapper[4733]: I0318 11:04:21.533740 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zsljc_7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0/kube-rbac-proxy/0.log" Mar 18 11:04:21 crc kubenswrapper[4733]: I0318 11:04:21.587905 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zsljc_7eed25d9-11cc-4ca1-b715-0a77d4dcc8e0/controller/0.log" Mar 18 11:04:21 crc kubenswrapper[4733]: I0318 11:04:21.769277 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-frr-files/0.log" Mar 18 11:04:21 crc kubenswrapper[4733]: I0318 11:04:21.936782 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-frr-files/0.log" Mar 18 11:04:21 crc kubenswrapper[4733]: I0318 11:04:21.990603 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-reloader/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.009971 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-metrics/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.017933 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-reloader/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.196730 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-reloader/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.230388 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-metrics/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.235123 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-frr-files/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.256286 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-metrics/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.417147 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/controller/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.450382 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-metrics/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.455242 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-reloader/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.460464 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/cp-frr-files/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.620578 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/frr-metrics/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.639892 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/kube-rbac-proxy-frr/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.656173 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/kube-rbac-proxy/0.log" Mar 18 11:04:22 crc kubenswrapper[4733]: I0318 11:04:22.997671 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-dr9dg_03476444-8ff8-4b1e-bcbc-ee654241370b/frr-k8s-webhook-server/0.log" Mar 18 11:04:23 crc kubenswrapper[4733]: I0318 11:04:23.009316 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/reloader/0.log" Mar 18 11:04:23 crc kubenswrapper[4733]: I0318 11:04:23.175153 4733 scope.go:117] "RemoveContainer" 
containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:04:23 crc kubenswrapper[4733]: E0318 11:04:23.175389 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:04:23 crc kubenswrapper[4733]: I0318 11:04:23.177395 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pc5zz_4a1a5332-5dcd-46ab-b584-0a8fb7feaa9e/frr/0.log" Mar 18 11:04:23 crc kubenswrapper[4733]: I0318 11:04:23.198035 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5ddc5ff65-jst9z_9731a250-9d78-43e0-bde3-7e769ea43d11/manager/0.log" Mar 18 11:04:23 crc kubenswrapper[4733]: I0318 11:04:23.336910 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c99d9f4d6-n5lc9_37ecdf54-7bcf-4d33-9cd9-f156974ea7f9/webhook-server/0.log" Mar 18 11:04:23 crc kubenswrapper[4733]: I0318 11:04:23.397673 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zg5cv_101c5687-bebd-449f-94c8-03077bf596d0/kube-rbac-proxy/0.log" Mar 18 11:04:23 crc kubenswrapper[4733]: I0318 11:04:23.637069 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zg5cv_101c5687-bebd-449f-94c8-03077bf596d0/speaker/0.log" Mar 18 11:04:26 crc kubenswrapper[4733]: I0318 11:04:26.176014 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:04:26 crc kubenswrapper[4733]: E0318 11:04:26.176566 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:04:33 crc kubenswrapper[4733]: I0318 11:04:33.175841 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:04:33 crc kubenswrapper[4733]: E0318 11:04:33.176832 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:04:36 crc kubenswrapper[4733]: I0318 11:04:36.175590 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:04:36 crc kubenswrapper[4733]: E0318 11:04:36.176311 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:04:36 crc kubenswrapper[4733]: I0318 11:04:36.885702 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p_3f95562e-ae03-4b2d-92b7-bc5593785f3c/util/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.087651 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p_3f95562e-ae03-4b2d-92b7-bc5593785f3c/util/0.log" Mar 18 11:04:37 crc 
kubenswrapper[4733]: I0318 11:04:37.105656 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p_3f95562e-ae03-4b2d-92b7-bc5593785f3c/pull/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.165045 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p_3f95562e-ae03-4b2d-92b7-bc5593785f3c/pull/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.443326 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p_3f95562e-ae03-4b2d-92b7-bc5593785f3c/pull/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.463499 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p_3f95562e-ae03-4b2d-92b7-bc5593785f3c/extract/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.470851 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874rxg2p_3f95562e-ae03-4b2d-92b7-bc5593785f3c/util/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.628903 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb_2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6/util/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.786598 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb_2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6/pull/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.817364 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb_2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6/pull/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.875462 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb_2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6/util/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.974854 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb_2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6/util/0.log" Mar 18 11:04:37 crc kubenswrapper[4733]: I0318 11:04:37.978050 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb_2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6/extract/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.042597 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d59fb_2d140fd7-f9d7-4432-aef0-ec0ab2e18cf6/pull/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.181539 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9s27_a3f55919-82b3-4117-8734-cb9a26364d83/extract-utilities/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.327940 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9s27_a3f55919-82b3-4117-8734-cb9a26364d83/extract-content/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.332324 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9s27_a3f55919-82b3-4117-8734-cb9a26364d83/extract-utilities/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.351084 4733 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9s27_a3f55919-82b3-4117-8734-cb9a26364d83/extract-content/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.507414 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9s27_a3f55919-82b3-4117-8734-cb9a26364d83/extract-utilities/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.541861 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9s27_a3f55919-82b3-4117-8734-cb9a26364d83/extract-content/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.765929 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-smkxx_3b618f79-3791-49a8-a6aa-307fb25af727/extract-utilities/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.888608 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-smkxx_3b618f79-3791-49a8-a6aa-307fb25af727/extract-content/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.906400 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c9s27_a3f55919-82b3-4117-8734-cb9a26364d83/registry-server/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.917973 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-smkxx_3b618f79-3791-49a8-a6aa-307fb25af727/extract-utilities/0.log" Mar 18 11:04:38 crc kubenswrapper[4733]: I0318 11:04:38.941662 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-smkxx_3b618f79-3791-49a8-a6aa-307fb25af727/extract-content/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.143630 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-smkxx_3b618f79-3791-49a8-a6aa-307fb25af727/extract-utilities/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.147746 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-smkxx_3b618f79-3791-49a8-a6aa-307fb25af727/extract-content/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.353153 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z6qb2_8ae3847e-6357-46a1-9578-88deb6e1531b/marketplace-operator/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.452693 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kcbhw_20640f37-bf35-4f24-abbb-b31cd00e5c9c/extract-utilities/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.614373 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-smkxx_3b618f79-3791-49a8-a6aa-307fb25af727/registry-server/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.631407 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kcbhw_20640f37-bf35-4f24-abbb-b31cd00e5c9c/extract-content/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.654411 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kcbhw_20640f37-bf35-4f24-abbb-b31cd00e5c9c/extract-utilities/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.727308 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kcbhw_20640f37-bf35-4f24-abbb-b31cd00e5c9c/extract-content/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.857778 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-kcbhw_20640f37-bf35-4f24-abbb-b31cd00e5c9c/extract-content/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.860915 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kcbhw_20640f37-bf35-4f24-abbb-b31cd00e5c9c/extract-utilities/0.log" Mar 18 11:04:39 crc kubenswrapper[4733]: I0318 11:04:39.981453 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kcbhw_20640f37-bf35-4f24-abbb-b31cd00e5c9c/registry-server/0.log" Mar 18 11:04:40 crc kubenswrapper[4733]: I0318 11:04:40.037606 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rs2b6_3018dd18-ee9f-44a1-ab22-a6bddde19b31/extract-utilities/0.log" Mar 18 11:04:40 crc kubenswrapper[4733]: I0318 11:04:40.176800 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:04:40 crc kubenswrapper[4733]: E0318 11:04:40.176991 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:04:40 crc kubenswrapper[4733]: I0318 11:04:40.224415 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rs2b6_3018dd18-ee9f-44a1-ab22-a6bddde19b31/extract-content/0.log" Mar 18 11:04:40 crc kubenswrapper[4733]: I0318 11:04:40.240720 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rs2b6_3018dd18-ee9f-44a1-ab22-a6bddde19b31/extract-content/0.log" Mar 18 11:04:40 crc kubenswrapper[4733]: I0318 11:04:40.260082 4733 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-rs2b6_3018dd18-ee9f-44a1-ab22-a6bddde19b31/extract-utilities/0.log" Mar 18 11:04:40 crc kubenswrapper[4733]: I0318 11:04:40.423995 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rs2b6_3018dd18-ee9f-44a1-ab22-a6bddde19b31/extract-utilities/0.log" Mar 18 11:04:40 crc kubenswrapper[4733]: I0318 11:04:40.443507 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rs2b6_3018dd18-ee9f-44a1-ab22-a6bddde19b31/extract-content/0.log" Mar 18 11:04:40 crc kubenswrapper[4733]: I0318 11:04:40.813242 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rs2b6_3018dd18-ee9f-44a1-ab22-a6bddde19b31/registry-server/0.log" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.496086 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w89tj"] Mar 18 11:04:42 crc kubenswrapper[4733]: E0318 11:04:42.496923 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720459c9-dd9a-4d0f-8541-f4f2f578acc5" containerName="oc" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.496936 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="720459c9-dd9a-4d0f-8541-f4f2f578acc5" containerName="oc" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.497101 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="720459c9-dd9a-4d0f-8541-f4f2f578acc5" containerName="oc" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.498240 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.509769 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w89tj"] Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.546531 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-catalog-content\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.546785 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pddvl\" (UniqueName: \"kubernetes.io/projected/720d76e8-3e14-45ec-87e6-99948b3e0f42-kube-api-access-pddvl\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.546860 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-utilities\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.647824 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-utilities\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.647888 4733 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-catalog-content\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.647980 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pddvl\" (UniqueName: \"kubernetes.io/projected/720d76e8-3e14-45ec-87e6-99948b3e0f42-kube-api-access-pddvl\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.648458 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-utilities\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.648672 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-catalog-content\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.674512 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pddvl\" (UniqueName: \"kubernetes.io/projected/720d76e8-3e14-45ec-87e6-99948b3e0f42-kube-api-access-pddvl\") pod \"certified-operators-w89tj\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:42 crc kubenswrapper[4733]: I0318 11:04:42.823496 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:43 crc kubenswrapper[4733]: I0318 11:04:43.373579 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w89tj"] Mar 18 11:04:43 crc kubenswrapper[4733]: I0318 11:04:43.849246 4733 generic.go:334] "Generic (PLEG): container finished" podID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerID="d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0" exitCode=0 Mar 18 11:04:43 crc kubenswrapper[4733]: I0318 11:04:43.849334 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w89tj" event={"ID":"720d76e8-3e14-45ec-87e6-99948b3e0f42","Type":"ContainerDied","Data":"d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0"} Mar 18 11:04:43 crc kubenswrapper[4733]: I0318 11:04:43.849562 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w89tj" event={"ID":"720d76e8-3e14-45ec-87e6-99948b3e0f42","Type":"ContainerStarted","Data":"9af4ebac00ba06ba0a306bf985e9fada6fd3b57d11129aef1aab44ff8dd47136"} Mar 18 11:04:44 crc kubenswrapper[4733]: I0318 11:04:44.858523 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w89tj" event={"ID":"720d76e8-3e14-45ec-87e6-99948b3e0f42","Type":"ContainerStarted","Data":"8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3"} Mar 18 11:04:45 crc kubenswrapper[4733]: I0318 11:04:45.178921 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:04:45 crc kubenswrapper[4733]: E0318 11:04:45.179172 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:04:45 crc kubenswrapper[4733]: I0318 11:04:45.867933 4733 generic.go:334] "Generic (PLEG): container finished" podID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerID="8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3" exitCode=0 Mar 18 11:04:45 crc kubenswrapper[4733]: I0318 11:04:45.867971 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w89tj" event={"ID":"720d76e8-3e14-45ec-87e6-99948b3e0f42","Type":"ContainerDied","Data":"8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3"} Mar 18 11:04:46 crc kubenswrapper[4733]: I0318 11:04:46.879311 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w89tj" event={"ID":"720d76e8-3e14-45ec-87e6-99948b3e0f42","Type":"ContainerStarted","Data":"43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff"} Mar 18 11:04:46 crc kubenswrapper[4733]: I0318 11:04:46.908600 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w89tj" podStartSLOduration=2.465932184 podStartE2EDuration="4.908581473s" podCreationTimestamp="2026-03-18 11:04:42 +0000 UTC" firstStartedPulling="2026-03-18 11:04:43.850902584 +0000 UTC m=+3123.342636909" lastFinishedPulling="2026-03-18 11:04:46.293551873 +0000 UTC m=+3125.785286198" observedRunningTime="2026-03-18 11:04:46.903396627 +0000 UTC m=+3126.395130962" watchObservedRunningTime="2026-03-18 11:04:46.908581473 +0000 UTC m=+3126.400315798" Mar 18 11:04:47 crc kubenswrapper[4733]: I0318 11:04:47.175645 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:04:47 crc kubenswrapper[4733]: E0318 11:04:47.175885 
4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:04:52 crc kubenswrapper[4733]: I0318 11:04:52.824346 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:52 crc kubenswrapper[4733]: I0318 11:04:52.824878 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:52 crc kubenswrapper[4733]: I0318 11:04:52.878773 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:52 crc kubenswrapper[4733]: I0318 11:04:52.965680 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:53 crc kubenswrapper[4733]: I0318 11:04:53.136910 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w89tj"] Mar 18 11:04:54 crc kubenswrapper[4733]: I0318 11:04:54.937869 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w89tj" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerName="registry-server" containerID="cri-o://43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff" gracePeriod=2 Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.175710 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:04:55 crc kubenswrapper[4733]: E0318 11:04:55.176433 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.375200 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.456320 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-utilities\") pod \"720d76e8-3e14-45ec-87e6-99948b3e0f42\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.456421 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-catalog-content\") pod \"720d76e8-3e14-45ec-87e6-99948b3e0f42\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.456490 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pddvl\" (UniqueName: \"kubernetes.io/projected/720d76e8-3e14-45ec-87e6-99948b3e0f42-kube-api-access-pddvl\") pod \"720d76e8-3e14-45ec-87e6-99948b3e0f42\" (UID: \"720d76e8-3e14-45ec-87e6-99948b3e0f42\") " Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.457349 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-utilities" (OuterVolumeSpecName: "utilities") pod "720d76e8-3e14-45ec-87e6-99948b3e0f42" (UID: "720d76e8-3e14-45ec-87e6-99948b3e0f42"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.463397 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720d76e8-3e14-45ec-87e6-99948b3e0f42-kube-api-access-pddvl" (OuterVolumeSpecName: "kube-api-access-pddvl") pod "720d76e8-3e14-45ec-87e6-99948b3e0f42" (UID: "720d76e8-3e14-45ec-87e6-99948b3e0f42"). InnerVolumeSpecName "kube-api-access-pddvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.560390 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pddvl\" (UniqueName: \"kubernetes.io/projected/720d76e8-3e14-45ec-87e6-99948b3e0f42-kube-api-access-pddvl\") on node \"crc\" DevicePath \"\"" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.560422 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.949750 4733 generic.go:334] "Generic (PLEG): container finished" podID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerID="43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff" exitCode=0 Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.949897 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w89tj" event={"ID":"720d76e8-3e14-45ec-87e6-99948b3e0f42","Type":"ContainerDied","Data":"43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff"} Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.950028 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w89tj" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.950040 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w89tj" event={"ID":"720d76e8-3e14-45ec-87e6-99948b3e0f42","Type":"ContainerDied","Data":"9af4ebac00ba06ba0a306bf985e9fada6fd3b57d11129aef1aab44ff8dd47136"} Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.950063 4733 scope.go:117] "RemoveContainer" containerID="43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.970331 4733 scope.go:117] "RemoveContainer" containerID="8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.993078 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "720d76e8-3e14-45ec-87e6-99948b3e0f42" (UID: "720d76e8-3e14-45ec-87e6-99948b3e0f42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:04:55 crc kubenswrapper[4733]: I0318 11:04:55.993910 4733 scope.go:117] "RemoveContainer" containerID="d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0" Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.032323 4733 scope.go:117] "RemoveContainer" containerID="43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff" Mar 18 11:04:56 crc kubenswrapper[4733]: E0318 11:04:56.033394 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff\": container with ID starting with 43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff not found: ID does not exist" containerID="43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff" Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.033471 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff"} err="failed to get container status \"43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff\": rpc error: code = NotFound desc = could not find container \"43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff\": container with ID starting with 43615e4c6b1987af73cae1c93ca995b55c2aafa63ad191baf1f760950b7c4bff not found: ID does not exist" Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.033503 4733 scope.go:117] "RemoveContainer" containerID="8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3" Mar 18 11:04:56 crc kubenswrapper[4733]: E0318 11:04:56.033890 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3\": container with ID starting with 
8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3 not found: ID does not exist" containerID="8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3" Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.033919 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3"} err="failed to get container status \"8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3\": rpc error: code = NotFound desc = could not find container \"8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3\": container with ID starting with 8b6502424e46362b8ac33d9fd6c25d1864e353fe3a70af5033ba644217eb03c3 not found: ID does not exist" Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.033941 4733 scope.go:117] "RemoveContainer" containerID="d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0" Mar 18 11:04:56 crc kubenswrapper[4733]: E0318 11:04:56.034368 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0\": container with ID starting with d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0 not found: ID does not exist" containerID="d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0" Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.034399 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0"} err="failed to get container status \"d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0\": rpc error: code = NotFound desc = could not find container \"d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0\": container with ID starting with d28f8f8f036e6e7300bc6459e017a6d89efcdd0467a2fe3dc572cd6efe822bf0 not found: ID does not 
exist" Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.067953 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/720d76e8-3e14-45ec-87e6-99948b3e0f42-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.284886 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w89tj"] Mar 18 11:04:56 crc kubenswrapper[4733]: I0318 11:04:56.296654 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w89tj"] Mar 18 11:04:57 crc kubenswrapper[4733]: I0318 11:04:57.175385 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:04:57 crc kubenswrapper[4733]: E0318 11:04:57.175743 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:04:57 crc kubenswrapper[4733]: I0318 11:04:57.191527 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" path="/var/lib/kubelet/pods/720d76e8-3e14-45ec-87e6-99948b3e0f42/volumes" Mar 18 11:05:00 crc kubenswrapper[4733]: I0318 11:05:00.175443 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:05:00 crc kubenswrapper[4733]: E0318 11:05:00.176316 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:05:09 crc kubenswrapper[4733]: I0318 11:05:09.175555 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:05:09 crc kubenswrapper[4733]: E0318 11:05:09.176425 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:05:10 crc kubenswrapper[4733]: I0318 11:05:10.175638 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:05:10 crc kubenswrapper[4733]: E0318 11:05:10.176077 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:05:12 crc kubenswrapper[4733]: I0318 11:05:12.175844 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:05:12 crc kubenswrapper[4733]: E0318 11:05:12.176984 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 
11:05:22 crc kubenswrapper[4733]: I0318 11:05:22.175632 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:05:22 crc kubenswrapper[4733]: E0318 11:05:22.176402 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:05:23 crc kubenswrapper[4733]: I0318 11:05:23.175827 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:05:23 crc kubenswrapper[4733]: E0318 11:05:23.176330 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:05:26 crc kubenswrapper[4733]: I0318 11:05:26.176118 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:05:26 crc kubenswrapper[4733]: E0318 11:05:26.176795 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:05:35 crc kubenswrapper[4733]: I0318 11:05:35.176118 4733 scope.go:117] "RemoveContainer" 
containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:05:35 crc kubenswrapper[4733]: E0318 11:05:35.178386 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:05:38 crc kubenswrapper[4733]: I0318 11:05:38.176570 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:05:38 crc kubenswrapper[4733]: E0318 11:05:38.177276 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:05:41 crc kubenswrapper[4733]: I0318 11:05:41.180463 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:05:41 crc kubenswrapper[4733]: E0318 11:05:41.181002 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:05:49 crc kubenswrapper[4733]: I0318 11:05:49.176423 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:05:49 crc kubenswrapper[4733]: E0318 11:05:49.177159 4733 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:05:52 crc kubenswrapper[4733]: I0318 11:05:52.175768 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:05:52 crc kubenswrapper[4733]: E0318 11:05:52.176566 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:05:54 crc kubenswrapper[4733]: I0318 11:05:54.175833 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:05:54 crc kubenswrapper[4733]: E0318 11:05:54.176377 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:05:57 crc kubenswrapper[4733]: I0318 11:05:57.482140 4733 generic.go:334] "Generic (PLEG): container finished" podID="5542a33f-3466-419c-af8f-3391bcc3d241" containerID="83201b6bc07225e6541cd95837440f81a991cdbcc76e2856eb4af65abc081fb9" exitCode=0 Mar 18 11:05:57 crc kubenswrapper[4733]: I0318 11:05:57.482244 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-m754c/must-gather-dtbq6" event={"ID":"5542a33f-3466-419c-af8f-3391bcc3d241","Type":"ContainerDied","Data":"83201b6bc07225e6541cd95837440f81a991cdbcc76e2856eb4af65abc081fb9"} Mar 18 11:05:57 crc kubenswrapper[4733]: I0318 11:05:57.483512 4733 scope.go:117] "RemoveContainer" containerID="83201b6bc07225e6541cd95837440f81a991cdbcc76e2856eb4af65abc081fb9" Mar 18 11:05:57 crc kubenswrapper[4733]: I0318 11:05:57.849328 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m754c_must-gather-dtbq6_5542a33f-3466-419c-af8f-3391bcc3d241/gather/0.log" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.145246 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563866-78qk8"] Mar 18 11:06:00 crc kubenswrapper[4733]: E0318 11:06:00.145910 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerName="extract-utilities" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.145925 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerName="extract-utilities" Mar 18 11:06:00 crc kubenswrapper[4733]: E0318 11:06:00.145937 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerName="extract-content" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.145945 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerName="extract-content" Mar 18 11:06:00 crc kubenswrapper[4733]: E0318 11:06:00.145957 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerName="registry-server" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.145965 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerName="registry-server" Mar 18 11:06:00 crc 
kubenswrapper[4733]: I0318 11:06:00.146130 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="720d76e8-3e14-45ec-87e6-99948b3e0f42" containerName="registry-server" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.146718 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-78qk8" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.148624 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.149197 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.150280 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.154583 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-78qk8"] Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.186142 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45g6m\" (UniqueName: \"kubernetes.io/projected/259a6c5e-ad27-412a-b854-776f42d48c84-kube-api-access-45g6m\") pod \"auto-csr-approver-29563866-78qk8\" (UID: \"259a6c5e-ad27-412a-b854-776f42d48c84\") " pod="openshift-infra/auto-csr-approver-29563866-78qk8" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.287259 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45g6m\" (UniqueName: \"kubernetes.io/projected/259a6c5e-ad27-412a-b854-776f42d48c84-kube-api-access-45g6m\") pod \"auto-csr-approver-29563866-78qk8\" (UID: \"259a6c5e-ad27-412a-b854-776f42d48c84\") " pod="openshift-infra/auto-csr-approver-29563866-78qk8" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.309054 
4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45g6m\" (UniqueName: \"kubernetes.io/projected/259a6c5e-ad27-412a-b854-776f42d48c84-kube-api-access-45g6m\") pod \"auto-csr-approver-29563866-78qk8\" (UID: \"259a6c5e-ad27-412a-b854-776f42d48c84\") " pod="openshift-infra/auto-csr-approver-29563866-78qk8" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.466847 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-78qk8" Mar 18 11:06:00 crc kubenswrapper[4733]: I0318 11:06:00.690317 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563866-78qk8"] Mar 18 11:06:01 crc kubenswrapper[4733]: I0318 11:06:01.511690 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-78qk8" event={"ID":"259a6c5e-ad27-412a-b854-776f42d48c84","Type":"ContainerStarted","Data":"0c1dd95218ba98497cb804ff0ac97f1c01408a8561346b9dcaac9335cfa062cd"} Mar 18 11:06:02 crc kubenswrapper[4733]: I0318 11:06:02.176379 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:06:02 crc kubenswrapper[4733]: E0318 11:06:02.176936 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:06:02 crc kubenswrapper[4733]: I0318 11:06:02.521559 4733 generic.go:334] "Generic (PLEG): container finished" podID="259a6c5e-ad27-412a-b854-776f42d48c84" containerID="3fa4661cb631b7da2b1c8775c9cfda48c185327f23a2376fd0ca5927a6b04720" exitCode=0 Mar 18 11:06:02 crc 
kubenswrapper[4733]: I0318 11:06:02.521632 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-78qk8" event={"ID":"259a6c5e-ad27-412a-b854-776f42d48c84","Type":"ContainerDied","Data":"3fa4661cb631b7da2b1c8775c9cfda48c185327f23a2376fd0ca5927a6b04720"} Mar 18 11:06:03 crc kubenswrapper[4733]: I0318 11:06:03.857412 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-78qk8" Mar 18 11:06:04 crc kubenswrapper[4733]: I0318 11:06:04.052211 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45g6m\" (UniqueName: \"kubernetes.io/projected/259a6c5e-ad27-412a-b854-776f42d48c84-kube-api-access-45g6m\") pod \"259a6c5e-ad27-412a-b854-776f42d48c84\" (UID: \"259a6c5e-ad27-412a-b854-776f42d48c84\") " Mar 18 11:06:04 crc kubenswrapper[4733]: I0318 11:06:04.059828 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259a6c5e-ad27-412a-b854-776f42d48c84-kube-api-access-45g6m" (OuterVolumeSpecName: "kube-api-access-45g6m") pod "259a6c5e-ad27-412a-b854-776f42d48c84" (UID: "259a6c5e-ad27-412a-b854-776f42d48c84"). InnerVolumeSpecName "kube-api-access-45g6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:06:04 crc kubenswrapper[4733]: I0318 11:06:04.154331 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45g6m\" (UniqueName: \"kubernetes.io/projected/259a6c5e-ad27-412a-b854-776f42d48c84-kube-api-access-45g6m\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:04 crc kubenswrapper[4733]: I0318 11:06:04.543446 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563866-78qk8" event={"ID":"259a6c5e-ad27-412a-b854-776f42d48c84","Type":"ContainerDied","Data":"0c1dd95218ba98497cb804ff0ac97f1c01408a8561346b9dcaac9335cfa062cd"} Mar 18 11:06:04 crc kubenswrapper[4733]: I0318 11:06:04.543492 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c1dd95218ba98497cb804ff0ac97f1c01408a8561346b9dcaac9335cfa062cd" Mar 18 11:06:04 crc kubenswrapper[4733]: I0318 11:06:04.543530 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563866-78qk8" Mar 18 11:06:04 crc kubenswrapper[4733]: I0318 11:06:04.916349 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-2k4dj"] Mar 18 11:06:04 crc kubenswrapper[4733]: I0318 11:06:04.921659 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563860-2k4dj"] Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.184963 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e41c70a-6d8d-47a8-9caf-57f46a60f96a" path="/var/lib/kubelet/pods/6e41c70a-6d8d-47a8-9caf-57f46a60f96a/volumes" Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.188965 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m754c/must-gather-dtbq6"] Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.189734 4733 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-m754c/must-gather-dtbq6" podUID="5542a33f-3466-419c-af8f-3391bcc3d241" containerName="copy" containerID="cri-o://1557b68a8a325a0ed8ba1781990482335f57ef77fd7aa971706340af195f6c7b" gracePeriod=2 Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.196765 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m754c/must-gather-dtbq6"] Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.553039 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m754c_must-gather-dtbq6_5542a33f-3466-419c-af8f-3391bcc3d241/copy/0.log" Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.553850 4733 generic.go:334] "Generic (PLEG): container finished" podID="5542a33f-3466-419c-af8f-3391bcc3d241" containerID="1557b68a8a325a0ed8ba1781990482335f57ef77fd7aa971706340af195f6c7b" exitCode=143 Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.701601 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m754c_must-gather-dtbq6_5542a33f-3466-419c-af8f-3391bcc3d241/copy/0.log" Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.705852 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.794922 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57p52\" (UniqueName: \"kubernetes.io/projected/5542a33f-3466-419c-af8f-3391bcc3d241-kube-api-access-57p52\") pod \"5542a33f-3466-419c-af8f-3391bcc3d241\" (UID: \"5542a33f-3466-419c-af8f-3391bcc3d241\") " Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.795009 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5542a33f-3466-419c-af8f-3391bcc3d241-must-gather-output\") pod \"5542a33f-3466-419c-af8f-3391bcc3d241\" (UID: \"5542a33f-3466-419c-af8f-3391bcc3d241\") " Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.810418 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5542a33f-3466-419c-af8f-3391bcc3d241-kube-api-access-57p52" (OuterVolumeSpecName: "kube-api-access-57p52") pod "5542a33f-3466-419c-af8f-3391bcc3d241" (UID: "5542a33f-3466-419c-af8f-3391bcc3d241"). InnerVolumeSpecName "kube-api-access-57p52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.906289 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57p52\" (UniqueName: \"kubernetes.io/projected/5542a33f-3466-419c-af8f-3391bcc3d241-kube-api-access-57p52\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:05 crc kubenswrapper[4733]: I0318 11:06:05.977801 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5542a33f-3466-419c-af8f-3391bcc3d241-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5542a33f-3466-419c-af8f-3391bcc3d241" (UID: "5542a33f-3466-419c-af8f-3391bcc3d241"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:06:06 crc kubenswrapper[4733]: I0318 11:06:06.007562 4733 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5542a33f-3466-419c-af8f-3391bcc3d241-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 11:06:06 crc kubenswrapper[4733]: I0318 11:06:06.175906 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:06:06 crc kubenswrapper[4733]: E0318 11:06:06.176209 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:06:06 crc kubenswrapper[4733]: I0318 11:06:06.563893 4733 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m754c_must-gather-dtbq6_5542a33f-3466-419c-af8f-3391bcc3d241/copy/0.log" Mar 18 11:06:06 crc kubenswrapper[4733]: I0318 11:06:06.564523 4733 scope.go:117] "RemoveContainer" containerID="1557b68a8a325a0ed8ba1781990482335f57ef77fd7aa971706340af195f6c7b" Mar 18 11:06:06 crc kubenswrapper[4733]: I0318 11:06:06.564543 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m754c/must-gather-dtbq6" Mar 18 11:06:06 crc kubenswrapper[4733]: I0318 11:06:06.582895 4733 scope.go:117] "RemoveContainer" containerID="83201b6bc07225e6541cd95837440f81a991cdbcc76e2856eb4af65abc081fb9" Mar 18 11:06:07 crc kubenswrapper[4733]: I0318 11:06:07.191781 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5542a33f-3466-419c-af8f-3391bcc3d241" path="/var/lib/kubelet/pods/5542a33f-3466-419c-af8f-3391bcc3d241/volumes" Mar 18 11:06:08 crc kubenswrapper[4733]: I0318 11:06:08.175591 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:06:08 crc kubenswrapper[4733]: E0318 11:06:08.176395 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:06:14 crc kubenswrapper[4733]: I0318 11:06:14.176280 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9" Mar 18 11:06:14 crc kubenswrapper[4733]: E0318 11:06:14.177373 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" Mar 18 11:06:20 crc kubenswrapper[4733]: I0318 11:06:20.176280 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:06:20 crc kubenswrapper[4733]: E0318 11:06:20.176994 
4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:06:22 crc kubenswrapper[4733]: I0318 11:06:22.511524 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:06:22 crc kubenswrapper[4733]: E0318 11:06:22.512890 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:06:25 crc kubenswrapper[4733]: I0318 11:06:25.176048 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:06:25 crc kubenswrapper[4733]: E0318 11:06:25.177072 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:06:34 crc kubenswrapper[4733]: I0318 11:06:34.175867 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:06:34 crc kubenswrapper[4733]: E0318 11:06:34.177436 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:06:36 crc kubenswrapper[4733]: I0318 11:06:36.175633 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:06:36 crc kubenswrapper[4733]: E0318 11:06:36.176485 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:06:39 crc kubenswrapper[4733]: I0318 11:06:39.175687 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:06:39 crc kubenswrapper[4733]: E0318 11:06:39.176235 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:06:46 crc kubenswrapper[4733]: I0318 11:06:46.175882 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:06:46 crc kubenswrapper[4733]: E0318 11:06:46.176652 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:06:48 crc kubenswrapper[4733]: I0318 11:06:48.176086 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:06:48 crc kubenswrapper[4733]: E0318 11:06:48.176812 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:06:53 crc kubenswrapper[4733]: I0318 11:06:53.177379 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:06:53 crc kubenswrapper[4733]: E0318 11:06:53.178451 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:07:01 crc kubenswrapper[4733]: I0318 11:07:01.180423 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:07:01 crc kubenswrapper[4733]: I0318 11:07:01.181086 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:07:01 crc kubenswrapper[4733]: E0318 11:07:01.181179 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:07:01 crc kubenswrapper[4733]: E0318 11:07:01.181462 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:07:07 crc kubenswrapper[4733]: I0318 11:07:07.175140 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:07:07 crc kubenswrapper[4733]: E0318 11:07:07.175932 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:07:14 crc kubenswrapper[4733]: I0318 11:07:14.175076 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:07:14 crc kubenswrapper[4733]: E0318 11:07:14.175859 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:07:15 crc kubenswrapper[4733]: I0318 11:07:15.175805 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:07:15 crc kubenswrapper[4733]: E0318 11:07:15.176097 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:07:16 crc kubenswrapper[4733]: I0318 11:07:16.532922 4733 scope.go:117] "RemoveContainer" containerID="ee8a5931d088bb90e3f8edd41217a30f581b7d88c4f982136e16b0f2b145d28c"
Mar 18 11:07:18 crc kubenswrapper[4733]: I0318 11:07:18.175936 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:07:18 crc kubenswrapper[4733]: E0318 11:07:18.176915 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:07:27 crc kubenswrapper[4733]: I0318 11:07:27.175526 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:07:27 crc kubenswrapper[4733]: I0318 11:07:27.176263 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:07:27 crc kubenswrapper[4733]: E0318 11:07:27.176593 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:07:27 crc kubenswrapper[4733]: E0318 11:07:27.176647 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:07:32 crc kubenswrapper[4733]: I0318 11:07:32.175709 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:07:32 crc kubenswrapper[4733]: E0318 11:07:32.176643 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:07:38 crc kubenswrapper[4733]: I0318 11:07:38.176122 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:07:38 crc kubenswrapper[4733]: E0318 11:07:38.176898 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:07:42 crc kubenswrapper[4733]: I0318 11:07:42.175303 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:07:42 crc kubenswrapper[4733]: E0318 11:07:42.176373 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:07:43 crc kubenswrapper[4733]: I0318 11:07:43.175940 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:07:43 crc kubenswrapper[4733]: E0318 11:07:43.176214 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2h7dp_openshift-machine-config-operator(6f75e1c5-e0c5-43df-944f-77b734070793)\"" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793"
Mar 18 11:07:51 crc kubenswrapper[4733]: I0318 11:07:51.183487 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:07:51 crc kubenswrapper[4733]: E0318 11:07:51.184141 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:07:54 crc kubenswrapper[4733]: I0318 11:07:54.175533 4733 scope.go:117] "RemoveContainer" containerID="6b9a340729099e48708bb3e49a96ed003cdb26d857ad4f772c65d5062fdefcf9"
Mar 18 11:07:55 crc kubenswrapper[4733]: I0318 11:07:55.416706 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" event={"ID":"6f75e1c5-e0c5-43df-944f-77b734070793","Type":"ContainerStarted","Data":"cfc089247a62ba1977b8f1dbc94d8aa7156f75d1768073acced9e580d06b9f83"}
Mar 18 11:07:57 crc kubenswrapper[4733]: I0318 11:07:57.175178 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:07:57 crc kubenswrapper[4733]: E0318 11:07:57.175510 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.188656 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563868-bsjmk"]
Mar 18 11:08:00 crc kubenswrapper[4733]: E0318 11:08:00.189659 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259a6c5e-ad27-412a-b854-776f42d48c84" containerName="oc"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.189676 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="259a6c5e-ad27-412a-b854-776f42d48c84" containerName="oc"
Mar 18 11:08:00 crc kubenswrapper[4733]: E0318 11:08:00.189693 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5542a33f-3466-419c-af8f-3391bcc3d241" containerName="gather"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.189701 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="5542a33f-3466-419c-af8f-3391bcc3d241" containerName="gather"
Mar 18 11:08:00 crc kubenswrapper[4733]: E0318 11:08:00.189714 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5542a33f-3466-419c-af8f-3391bcc3d241" containerName="copy"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.189723 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="5542a33f-3466-419c-af8f-3391bcc3d241" containerName="copy"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.189937 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="5542a33f-3466-419c-af8f-3391bcc3d241" containerName="copy"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.189958 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="5542a33f-3466-419c-af8f-3391bcc3d241" containerName="gather"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.189976 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="259a6c5e-ad27-412a-b854-776f42d48c84" containerName="oc"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.190662 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-bsjmk"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.192211 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.192310 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.192993 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.197588 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-bsjmk"]
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.334949 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvqk9\" (UniqueName: \"kubernetes.io/projected/81e9d3af-d753-42e5-b7e3-529203e7cfab-kube-api-access-xvqk9\") pod \"auto-csr-approver-29563868-bsjmk\" (UID: \"81e9d3af-d753-42e5-b7e3-529203e7cfab\") " pod="openshift-infra/auto-csr-approver-29563868-bsjmk"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.436660 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvqk9\" (UniqueName: \"kubernetes.io/projected/81e9d3af-d753-42e5-b7e3-529203e7cfab-kube-api-access-xvqk9\") pod \"auto-csr-approver-29563868-bsjmk\" (UID: \"81e9d3af-d753-42e5-b7e3-529203e7cfab\") " pod="openshift-infra/auto-csr-approver-29563868-bsjmk"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.462535 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvqk9\" (UniqueName: \"kubernetes.io/projected/81e9d3af-d753-42e5-b7e3-529203e7cfab-kube-api-access-xvqk9\") pod \"auto-csr-approver-29563868-bsjmk\" (UID: \"81e9d3af-d753-42e5-b7e3-529203e7cfab\") " pod="openshift-infra/auto-csr-approver-29563868-bsjmk"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.511640 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-bsjmk"
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.916736 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563868-bsjmk"]
Mar 18 11:08:00 crc kubenswrapper[4733]: W0318 11:08:00.922230 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e9d3af_d753_42e5_b7e3_529203e7cfab.slice/crio-68224e65287acce10060c36b5fe003286c5f71fcd676f7c816e02786f315b5c8 WatchSource:0}: Error finding container 68224e65287acce10060c36b5fe003286c5f71fcd676f7c816e02786f315b5c8: Status 404 returned error can't find the container with id 68224e65287acce10060c36b5fe003286c5f71fcd676f7c816e02786f315b5c8
Mar 18 11:08:00 crc kubenswrapper[4733]: I0318 11:08:00.925146 4733 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 11:08:01 crc kubenswrapper[4733]: I0318 11:08:01.470350 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-bsjmk" event={"ID":"81e9d3af-d753-42e5-b7e3-529203e7cfab","Type":"ContainerStarted","Data":"68224e65287acce10060c36b5fe003286c5f71fcd676f7c816e02786f315b5c8"}
Mar 18 11:08:02 crc kubenswrapper[4733]: I0318 11:08:02.481033 4733 generic.go:334] "Generic (PLEG): container finished" podID="81e9d3af-d753-42e5-b7e3-529203e7cfab" containerID="29bf663b381a7fa830f7e7010a5bc46c4197d3720124ddb060ebae43c856ab2e" exitCode=0
Mar 18 11:08:02 crc kubenswrapper[4733]: I0318 11:08:02.481095 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-bsjmk" event={"ID":"81e9d3af-d753-42e5-b7e3-529203e7cfab","Type":"ContainerDied","Data":"29bf663b381a7fa830f7e7010a5bc46c4197d3720124ddb060ebae43c856ab2e"}
Mar 18 11:08:03 crc kubenswrapper[4733]: I0318 11:08:03.821170 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-bsjmk"
Mar 18 11:08:03 crc kubenswrapper[4733]: I0318 11:08:03.995402 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvqk9\" (UniqueName: \"kubernetes.io/projected/81e9d3af-d753-42e5-b7e3-529203e7cfab-kube-api-access-xvqk9\") pod \"81e9d3af-d753-42e5-b7e3-529203e7cfab\" (UID: \"81e9d3af-d753-42e5-b7e3-529203e7cfab\") "
Mar 18 11:08:04 crc kubenswrapper[4733]: I0318 11:08:04.000401 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e9d3af-d753-42e5-b7e3-529203e7cfab-kube-api-access-xvqk9" (OuterVolumeSpecName: "kube-api-access-xvqk9") pod "81e9d3af-d753-42e5-b7e3-529203e7cfab" (UID: "81e9d3af-d753-42e5-b7e3-529203e7cfab"). InnerVolumeSpecName "kube-api-access-xvqk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:08:04 crc kubenswrapper[4733]: I0318 11:08:04.097154 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvqk9\" (UniqueName: \"kubernetes.io/projected/81e9d3af-d753-42e5-b7e3-529203e7cfab-kube-api-access-xvqk9\") on node \"crc\" DevicePath \"\""
Mar 18 11:08:04 crc kubenswrapper[4733]: I0318 11:08:04.176150 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:08:04 crc kubenswrapper[4733]: E0318 11:08:04.177062 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:08:04 crc kubenswrapper[4733]: I0318 11:08:04.509174 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563868-bsjmk" event={"ID":"81e9d3af-d753-42e5-b7e3-529203e7cfab","Type":"ContainerDied","Data":"68224e65287acce10060c36b5fe003286c5f71fcd676f7c816e02786f315b5c8"}
Mar 18 11:08:04 crc kubenswrapper[4733]: I0318 11:08:04.509240 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68224e65287acce10060c36b5fe003286c5f71fcd676f7c816e02786f315b5c8"
Mar 18 11:08:04 crc kubenswrapper[4733]: I0318 11:08:04.509257 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563868-bsjmk"
Mar 18 11:08:04 crc kubenswrapper[4733]: I0318 11:08:04.883999 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-vmt2n"]
Mar 18 11:08:04 crc kubenswrapper[4733]: I0318 11:08:04.889529 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563862-vmt2n"]
Mar 18 11:08:05 crc kubenswrapper[4733]: I0318 11:08:05.191336 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad06f0e3-f671-426a-b6d8-793e87745364" path="/var/lib/kubelet/pods/ad06f0e3-f671-426a-b6d8-793e87745364/volumes"
Mar 18 11:08:09 crc kubenswrapper[4733]: I0318 11:08:09.176483 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:08:09 crc kubenswrapper[4733]: E0318 11:08:09.177315 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:08:15 crc kubenswrapper[4733]: I0318 11:08:15.176706 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:08:15 crc kubenswrapper[4733]: E0318 11:08:15.177844 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:08:16 crc kubenswrapper[4733]: I0318 11:08:16.599242 4733 scope.go:117] "RemoveContainer" containerID="8d69124510c80bc1e6643920b76340f2795eccc769fbdff3665f6edcc2793fe5"
Mar 18 11:08:20 crc kubenswrapper[4733]: I0318 11:08:20.175044 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:08:20 crc kubenswrapper[4733]: E0318 11:08:20.175802 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:08:26 crc kubenswrapper[4733]: I0318 11:08:26.175992 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:08:26 crc kubenswrapper[4733]: E0318 11:08:26.176706 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:08:34 crc kubenswrapper[4733]: I0318 11:08:34.175942 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:08:34 crc kubenswrapper[4733]: E0318 11:08:34.177586 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:08:40 crc kubenswrapper[4733]: I0318 11:08:40.175796 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:08:40 crc kubenswrapper[4733]: E0318 11:08:40.176514 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:08:49 crc kubenswrapper[4733]: I0318 11:08:49.176165 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:08:49 crc kubenswrapper[4733]: E0318 11:08:49.176959 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:08:52 crc kubenswrapper[4733]: I0318 11:08:52.175672 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:08:52 crc kubenswrapper[4733]: E0318 11:08:52.176363 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:09:00 crc kubenswrapper[4733]: I0318 11:09:00.917526 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qjhkq"]
Mar 18 11:09:00 crc kubenswrapper[4733]: E0318 11:09:00.918426 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e9d3af-d753-42e5-b7e3-529203e7cfab" containerName="oc"
Mar 18 11:09:00 crc kubenswrapper[4733]: I0318 11:09:00.918457 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e9d3af-d753-42e5-b7e3-529203e7cfab" containerName="oc"
Mar 18 11:09:00 crc kubenswrapper[4733]: I0318 11:09:00.918661 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e9d3af-d753-42e5-b7e3-529203e7cfab" containerName="oc"
Mar 18 11:09:00 crc kubenswrapper[4733]: I0318 11:09:00.920005 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:00 crc kubenswrapper[4733]: I0318 11:09:00.933086 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjhkq"]
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.090354 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wswjh\" (UniqueName: \"kubernetes.io/projected/d5decccb-25db-497f-834d-00a802b2a5ca-kube-api-access-wswjh\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.090409 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-catalog-content\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.090526 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-utilities\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.191961 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wswjh\" (UniqueName: \"kubernetes.io/projected/d5decccb-25db-497f-834d-00a802b2a5ca-kube-api-access-wswjh\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.192011 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-catalog-content\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.192092 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-utilities\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.192694 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-catalog-content\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.192694 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-utilities\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.210247 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wswjh\" (UniqueName: \"kubernetes.io/projected/d5decccb-25db-497f-834d-00a802b2a5ca-kube-api-access-wswjh\") pod \"community-operators-qjhkq\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") " pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.246785 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:01 crc kubenswrapper[4733]: I0318 11:09:01.527867 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjhkq"]
Mar 18 11:09:02 crc kubenswrapper[4733]: I0318 11:09:02.007521 4733 generic.go:334] "Generic (PLEG): container finished" podID="d5decccb-25db-497f-834d-00a802b2a5ca" containerID="b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b" exitCode=0
Mar 18 11:09:02 crc kubenswrapper[4733]: I0318 11:09:02.007564 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjhkq" event={"ID":"d5decccb-25db-497f-834d-00a802b2a5ca","Type":"ContainerDied","Data":"b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b"}
Mar 18 11:09:02 crc kubenswrapper[4733]: I0318 11:09:02.007868 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjhkq" event={"ID":"d5decccb-25db-497f-834d-00a802b2a5ca","Type":"ContainerStarted","Data":"e9440f409036519ec55c7c9ca7bffba341f56cdd2766643a8f9051792924a14b"}
Mar 18 11:09:03 crc kubenswrapper[4733]: I0318 11:09:03.176968 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01"
Mar 18 11:09:03 crc kubenswrapper[4733]: E0318 11:09:03.177441 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:09:04 crc kubenswrapper[4733]: I0318 11:09:04.024495 4733 generic.go:334] "Generic (PLEG): container finished" podID="d5decccb-25db-497f-834d-00a802b2a5ca" containerID="0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3" exitCode=0
Mar 18 11:09:04 crc kubenswrapper[4733]: I0318 11:09:04.024564 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjhkq" event={"ID":"d5decccb-25db-497f-834d-00a802b2a5ca","Type":"ContainerDied","Data":"0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3"}
Mar 18 11:09:04 crc kubenswrapper[4733]: I0318 11:09:04.176081 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2"
Mar 18 11:09:04 crc kubenswrapper[4733]: E0318 11:09:04.176713 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:09:05 crc kubenswrapper[4733]: I0318 11:09:05.032863 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjhkq" event={"ID":"d5decccb-25db-497f-834d-00a802b2a5ca","Type":"ContainerStarted","Data":"7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3"}
Mar 18 11:09:05 crc kubenswrapper[4733]: I0318 11:09:05.052607 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qjhkq" podStartSLOduration=2.505431591 podStartE2EDuration="5.052584947s" podCreationTimestamp="2026-03-18 11:09:00 +0000 UTC" firstStartedPulling="2026-03-18 11:09:02.014469089 +0000 UTC m=+3381.506203434" lastFinishedPulling="2026-03-18 11:09:04.561622435 +0000 UTC m=+3384.053356790" observedRunningTime="2026-03-18 11:09:05.046794004 +0000 UTC m=+3384.538528359" watchObservedRunningTime="2026-03-18 11:09:05.052584947 +0000 UTC m=+3384.544319272"
Mar 18 11:09:11 crc kubenswrapper[4733]: I0318 11:09:11.246931 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:11 crc kubenswrapper[4733]: I0318 11:09:11.247509 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:11 crc kubenswrapper[4733]: I0318 11:09:11.315393 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:12 crc kubenswrapper[4733]: I0318 11:09:12.151245 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:12 crc kubenswrapper[4733]: I0318 11:09:12.201765 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qjhkq"]
Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.123275 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qjhkq" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" containerName="registry-server" containerID="cri-o://7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3" gracePeriod=2
Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.550130 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjhkq"
Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.686427 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wswjh\" (UniqueName: \"kubernetes.io/projected/d5decccb-25db-497f-834d-00a802b2a5ca-kube-api-access-wswjh\") pod \"d5decccb-25db-497f-834d-00a802b2a5ca\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") "
Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.686514 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-catalog-content\") pod \"d5decccb-25db-497f-834d-00a802b2a5ca\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") "
Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.686702 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-utilities\") pod \"d5decccb-25db-497f-834d-00a802b2a5ca\" (UID: \"d5decccb-25db-497f-834d-00a802b2a5ca\") "
Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.687654 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-utilities" (OuterVolumeSpecName: "utilities") pod "d5decccb-25db-497f-834d-00a802b2a5ca" (UID: "d5decccb-25db-497f-834d-00a802b2a5ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.694583 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5decccb-25db-497f-834d-00a802b2a5ca-kube-api-access-wswjh" (OuterVolumeSpecName: "kube-api-access-wswjh") pod "d5decccb-25db-497f-834d-00a802b2a5ca" (UID: "d5decccb-25db-497f-834d-00a802b2a5ca"). InnerVolumeSpecName "kube-api-access-wswjh".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.788600 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 11:09:14 crc kubenswrapper[4733]: I0318 11:09:14.788647 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wswjh\" (UniqueName: \"kubernetes.io/projected/d5decccb-25db-497f-834d-00a802b2a5ca-kube-api-access-wswjh\") on node \"crc\" DevicePath \"\"" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.133693 4733 generic.go:334] "Generic (PLEG): container finished" podID="d5decccb-25db-497f-834d-00a802b2a5ca" containerID="7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3" exitCode=0 Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.133825 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjhkq" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.133747 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjhkq" event={"ID":"d5decccb-25db-497f-834d-00a802b2a5ca","Type":"ContainerDied","Data":"7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3"} Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.133985 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjhkq" event={"ID":"d5decccb-25db-497f-834d-00a802b2a5ca","Type":"ContainerDied","Data":"e9440f409036519ec55c7c9ca7bffba341f56cdd2766643a8f9051792924a14b"} Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.134030 4733 scope.go:117] "RemoveContainer" containerID="7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.159283 4733 scope.go:117] "RemoveContainer" 
containerID="0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.198765 4733 scope.go:117] "RemoveContainer" containerID="b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.241040 4733 scope.go:117] "RemoveContainer" containerID="7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3" Mar 18 11:09:15 crc kubenswrapper[4733]: E0318 11:09:15.241529 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3\": container with ID starting with 7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3 not found: ID does not exist" containerID="7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.241657 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3"} err="failed to get container status \"7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3\": rpc error: code = NotFound desc = could not find container \"7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3\": container with ID starting with 7298bb974b60eb967ef1ec7369e4539b9ca038a55f17d333b8184b4668a6a4a3 not found: ID does not exist" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.241755 4733 scope.go:117] "RemoveContainer" containerID="0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3" Mar 18 11:09:15 crc kubenswrapper[4733]: E0318 11:09:15.242250 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3\": container with ID starting with 
0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3 not found: ID does not exist" containerID="0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.242282 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3"} err="failed to get container status \"0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3\": rpc error: code = NotFound desc = could not find container \"0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3\": container with ID starting with 0b138f51f2df78a11e054926beb21b5f8ff84a8876455818cb6870ff0aa34fc3 not found: ID does not exist" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.242302 4733 scope.go:117] "RemoveContainer" containerID="b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b" Mar 18 11:09:15 crc kubenswrapper[4733]: E0318 11:09:15.242570 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b\": container with ID starting with b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b not found: ID does not exist" containerID="b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.242672 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b"} err="failed to get container status \"b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b\": rpc error: code = NotFound desc = could not find container \"b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b\": container with ID starting with b84fc31514ed603bba303ec28f5a426ecb6ae7c046b4a75d643b055090d6437b not found: ID does not 
exist" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.409078 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5decccb-25db-497f-834d-00a802b2a5ca" (UID: "d5decccb-25db-497f-834d-00a802b2a5ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.482847 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qjhkq"] Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.489039 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qjhkq"] Mar 18 11:09:15 crc kubenswrapper[4733]: I0318 11:09:15.499905 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5decccb-25db-497f-834d-00a802b2a5ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 11:09:16 crc kubenswrapper[4733]: I0318 11:09:16.176037 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:09:16 crc kubenswrapper[4733]: I0318 11:09:16.176305 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:09:17 crc kubenswrapper[4733]: I0318 11:09:17.160725 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerStarted","Data":"42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830"} Mar 18 11:09:17 crc kubenswrapper[4733]: I0318 11:09:17.161002 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 11:09:17 crc kubenswrapper[4733]: I0318 11:09:17.163438 4733 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerStarted","Data":"438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97"} Mar 18 11:09:17 crc kubenswrapper[4733]: I0318 11:09:17.163663 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 11:09:17 crc kubenswrapper[4733]: I0318 11:09:17.212105 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" path="/var/lib/kubelet/pods/d5decccb-25db-497f-834d-00a802b2a5ca/volumes" Mar 18 11:09:21 crc kubenswrapper[4733]: I0318 11:09:21.207284 4733 generic.go:334] "Generic (PLEG): container finished" podID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830" exitCode=0 Mar 18 11:09:21 crc kubenswrapper[4733]: I0318 11:09:21.207438 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b4a4e3e2-bd4d-4f8d-97bc-51267378ab03","Type":"ContainerDied","Data":"42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830"} Mar 18 11:09:21 crc kubenswrapper[4733]: I0318 11:09:21.207944 4733 scope.go:117] "RemoveContainer" containerID="309174b794edb6ce74f0fdb4a12ba0a1a8e65a9dcfd1acde2e49c6c8caf177d2" Mar 18 11:09:21 crc kubenswrapper[4733]: I0318 11:09:21.208827 4733 scope.go:117] "RemoveContainer" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830" Mar 18 11:09:21 crc kubenswrapper[4733]: E0318 11:09:21.209270 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:09:21 crc 
kubenswrapper[4733]: I0318 11:09:21.215317 4733 generic.go:334] "Generic (PLEG): container finished" podID="f0570ce4-1455-4698-85cf-01f7108d9e7f" containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97" exitCode=0 Mar 18 11:09:21 crc kubenswrapper[4733]: I0318 11:09:21.215360 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0570ce4-1455-4698-85cf-01f7108d9e7f","Type":"ContainerDied","Data":"438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97"} Mar 18 11:09:21 crc kubenswrapper[4733]: I0318 11:09:21.215939 4733 scope.go:117] "RemoveContainer" containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97" Mar 18 11:09:21 crc kubenswrapper[4733]: E0318 11:09:21.216138 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:09:21 crc kubenswrapper[4733]: I0318 11:09:21.775825 4733 scope.go:117] "RemoveContainer" containerID="32438c6a9409b79313e3b8972bb637f88d330bfefddb767cde843ec5e6f0eb01" Mar 18 11:09:36 crc kubenswrapper[4733]: I0318 11:09:36.177247 4733 scope.go:117] "RemoveContainer" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830" Mar 18 11:09:36 crc kubenswrapper[4733]: E0318 11:09:36.178408 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:09:36 crc kubenswrapper[4733]: I0318 11:09:36.178662 4733 scope.go:117] "RemoveContainer" 
containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97" Mar 18 11:09:36 crc kubenswrapper[4733]: E0318 11:09:36.178899 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:09:51 crc kubenswrapper[4733]: I0318 11:09:51.187466 4733 scope.go:117] "RemoveContainer" containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97" Mar 18 11:09:51 crc kubenswrapper[4733]: I0318 11:09:51.188179 4733 scope.go:117] "RemoveContainer" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830" Mar 18 11:09:51 crc kubenswrapper[4733]: E0318 11:09:51.188531 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:09:51 crc kubenswrapper[4733]: E0318 11:09:51.188813 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.158560 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563870-x5wsc"] Mar 18 11:10:00 crc kubenswrapper[4733]: E0318 11:10:00.160292 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" 
containerName="extract-utilities" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.160375 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" containerName="extract-utilities" Mar 18 11:10:00 crc kubenswrapper[4733]: E0318 11:10:00.160467 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" containerName="extract-content" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.160534 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" containerName="extract-content" Mar 18 11:10:00 crc kubenswrapper[4733]: E0318 11:10:00.160613 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" containerName="registry-server" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.160669 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" containerName="registry-server" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.160890 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5decccb-25db-497f-834d-00a802b2a5ca" containerName="registry-server" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.161517 4733 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.163722 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.165206 4733 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.171575 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-x5wsc"] Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.213011 4733 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wmd5k" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.277902 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t7wn\" (UniqueName: \"kubernetes.io/projected/9b718b15-0627-4a8f-999c-20a09469f53b-kube-api-access-2t7wn\") pod \"auto-csr-approver-29563870-x5wsc\" (UID: \"9b718b15-0627-4a8f-999c-20a09469f53b\") " pod="openshift-infra/auto-csr-approver-29563870-x5wsc" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.378932 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t7wn\" (UniqueName: \"kubernetes.io/projected/9b718b15-0627-4a8f-999c-20a09469f53b-kube-api-access-2t7wn\") pod \"auto-csr-approver-29563870-x5wsc\" (UID: \"9b718b15-0627-4a8f-999c-20a09469f53b\") " pod="openshift-infra/auto-csr-approver-29563870-x5wsc" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.411372 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t7wn\" (UniqueName: \"kubernetes.io/projected/9b718b15-0627-4a8f-999c-20a09469f53b-kube-api-access-2t7wn\") pod \"auto-csr-approver-29563870-x5wsc\" (UID: \"9b718b15-0627-4a8f-999c-20a09469f53b\") " 
pod="openshift-infra/auto-csr-approver-29563870-x5wsc" Mar 18 11:10:00 crc kubenswrapper[4733]: I0318 11:10:00.528380 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" Mar 18 11:10:01 crc kubenswrapper[4733]: I0318 11:10:00.999759 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563870-x5wsc"] Mar 18 11:10:01 crc kubenswrapper[4733]: W0318 11:10:01.006370 4733 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b718b15_0627_4a8f_999c_20a09469f53b.slice/crio-f003638e921861d82ddb6efe47782100f535e8007625abc6735f119a8d816ae1 WatchSource:0}: Error finding container f003638e921861d82ddb6efe47782100f535e8007625abc6735f119a8d816ae1: Status 404 returned error can't find the container with id f003638e921861d82ddb6efe47782100f535e8007625abc6735f119a8d816ae1 Mar 18 11:10:01 crc kubenswrapper[4733]: I0318 11:10:01.596012 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" event={"ID":"9b718b15-0627-4a8f-999c-20a09469f53b","Type":"ContainerStarted","Data":"f003638e921861d82ddb6efe47782100f535e8007625abc6735f119a8d816ae1"} Mar 18 11:10:02 crc kubenswrapper[4733]: I0318 11:10:02.607341 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" event={"ID":"9b718b15-0627-4a8f-999c-20a09469f53b","Type":"ContainerStarted","Data":"9245c66eef13b65a42f75aef76c4af481f3a188f6dd89682a56798c08e81d427"} Mar 18 11:10:02 crc kubenswrapper[4733]: I0318 11:10:02.633924 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" podStartSLOduration=1.468589797 podStartE2EDuration="2.633896662s" podCreationTimestamp="2026-03-18 11:10:00 +0000 UTC" firstStartedPulling="2026-03-18 11:10:01.008604068 +0000 UTC 
m=+3440.500338393" lastFinishedPulling="2026-03-18 11:10:02.173910923 +0000 UTC m=+3441.665645258" observedRunningTime="2026-03-18 11:10:02.623034886 +0000 UTC m=+3442.114769281" watchObservedRunningTime="2026-03-18 11:10:02.633896662 +0000 UTC m=+3442.125631027" Mar 18 11:10:03 crc kubenswrapper[4733]: I0318 11:10:03.614783 4733 generic.go:334] "Generic (PLEG): container finished" podID="9b718b15-0627-4a8f-999c-20a09469f53b" containerID="9245c66eef13b65a42f75aef76c4af481f3a188f6dd89682a56798c08e81d427" exitCode=0 Mar 18 11:10:03 crc kubenswrapper[4733]: I0318 11:10:03.614829 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" event={"ID":"9b718b15-0627-4a8f-999c-20a09469f53b","Type":"ContainerDied","Data":"9245c66eef13b65a42f75aef76c4af481f3a188f6dd89682a56798c08e81d427"} Mar 18 11:10:04 crc kubenswrapper[4733]: I0318 11:10:04.176079 4733 scope.go:117] "RemoveContainer" containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97" Mar 18 11:10:04 crc kubenswrapper[4733]: E0318 11:10:04.176494 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f" Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.075409 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.177558 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t7wn\" (UniqueName: \"kubernetes.io/projected/9b718b15-0627-4a8f-999c-20a09469f53b-kube-api-access-2t7wn\") pod \"9b718b15-0627-4a8f-999c-20a09469f53b\" (UID: \"9b718b15-0627-4a8f-999c-20a09469f53b\") " Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.184008 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b718b15-0627-4a8f-999c-20a09469f53b-kube-api-access-2t7wn" (OuterVolumeSpecName: "kube-api-access-2t7wn") pod "9b718b15-0627-4a8f-999c-20a09469f53b" (UID: "9b718b15-0627-4a8f-999c-20a09469f53b"). InnerVolumeSpecName "kube-api-access-2t7wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.279624 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t7wn\" (UniqueName: \"kubernetes.io/projected/9b718b15-0627-4a8f-999c-20a09469f53b-kube-api-access-2t7wn\") on node \"crc\" DevicePath \"\"" Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.635710 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" event={"ID":"9b718b15-0627-4a8f-999c-20a09469f53b","Type":"ContainerDied","Data":"f003638e921861d82ddb6efe47782100f535e8007625abc6735f119a8d816ae1"} Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.636115 4733 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f003638e921861d82ddb6efe47782100f535e8007625abc6735f119a8d816ae1" Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.635950 4733 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563870-x5wsc" Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.701679 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-g5m6l"] Mar 18 11:10:05 crc kubenswrapper[4733]: I0318 11:10:05.717664 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563864-g5m6l"] Mar 18 11:10:06 crc kubenswrapper[4733]: I0318 11:10:06.176334 4733 scope.go:117] "RemoveContainer" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830" Mar 18 11:10:06 crc kubenswrapper[4733]: E0318 11:10:06.176722 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03" Mar 18 11:10:07 crc kubenswrapper[4733]: I0318 11:10:07.214328 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720459c9-dd9a-4d0f-8541-f4f2f578acc5" path="/var/lib/kubelet/pods/720459c9-dd9a-4d0f-8541-f4f2f578acc5/volumes" Mar 18 11:10:13 crc kubenswrapper[4733]: I0318 11:10:13.570818 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 11:10:13 crc kubenswrapper[4733]: I0318 11:10:13.571565 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused"
Mar 18 11:10:16 crc kubenswrapper[4733]: I0318 11:10:16.685610 4733 scope.go:117] "RemoveContainer" containerID="b3d994fcf267bc98b12bd59da2b08ea67ce03ac437e382c7728b7a3d005bd1f9"
Mar 18 11:10:18 crc kubenswrapper[4733]: I0318 11:10:18.175948 4733 scope.go:117] "RemoveContainer" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830"
Mar 18 11:10:18 crc kubenswrapper[4733]: I0318 11:10:18.176514 4733 scope.go:117] "RemoveContainer" containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97"
Mar 18 11:10:18 crc kubenswrapper[4733]: E0318 11:10:18.176794 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:10:18 crc kubenswrapper[4733]: E0318 11:10:18.177711 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:10:31 crc kubenswrapper[4733]: I0318 11:10:31.195571 4733 scope.go:117] "RemoveContainer" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830"
Mar 18 11:10:31 crc kubenswrapper[4733]: E0318 11:10:31.196480 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.156839 4733 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8kd26"]
Mar 18 11:10:33 crc kubenswrapper[4733]: E0318 11:10:33.158180 4733 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b718b15-0627-4a8f-999c-20a09469f53b" containerName="oc"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.158245 4733 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b718b15-0627-4a8f-999c-20a09469f53b" containerName="oc"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.159181 4733 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b718b15-0627-4a8f-999c-20a09469f53b" containerName="oc"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.161320 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.171340 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kd26"]
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.178485 4733 scope.go:117] "RemoveContainer" containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97"
Mar 18 11:10:33 crc kubenswrapper[4733]: E0318 11:10:33.178891 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.223861 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm8v9\" (UniqueName: \"kubernetes.io/projected/367d4602-00b0-4165-9bc2-8c95171f5d9c-kube-api-access-wm8v9\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.223943 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-catalog-content\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.224006 4733 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-utilities\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.325920 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm8v9\" (UniqueName: \"kubernetes.io/projected/367d4602-00b0-4165-9bc2-8c95171f5d9c-kube-api-access-wm8v9\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.326219 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-catalog-content\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.326344 4733 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-utilities\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.326761 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-catalog-content\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.326857 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-utilities\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.355705 4733 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm8v9\" (UniqueName: \"kubernetes.io/projected/367d4602-00b0-4165-9bc2-8c95171f5d9c-kube-api-access-wm8v9\") pod \"redhat-operators-8kd26\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") " pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.522711 4733 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:33 crc kubenswrapper[4733]: I0318 11:10:33.970692 4733 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kd26"]
Mar 18 11:10:34 crc kubenswrapper[4733]: I0318 11:10:34.896369 4733 generic.go:334] "Generic (PLEG): container finished" podID="367d4602-00b0-4165-9bc2-8c95171f5d9c" containerID="bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20" exitCode=0
Mar 18 11:10:34 crc kubenswrapper[4733]: I0318 11:10:34.896472 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kd26" event={"ID":"367d4602-00b0-4165-9bc2-8c95171f5d9c","Type":"ContainerDied","Data":"bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20"}
Mar 18 11:10:34 crc kubenswrapper[4733]: I0318 11:10:34.896760 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kd26" event={"ID":"367d4602-00b0-4165-9bc2-8c95171f5d9c","Type":"ContainerStarted","Data":"37730f69b5210919a437b0a5480e2a30678d89c33bbbcafdd873001ce13f8f7c"}
Mar 18 11:10:36 crc kubenswrapper[4733]: I0318 11:10:36.917039 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kd26" event={"ID":"367d4602-00b0-4165-9bc2-8c95171f5d9c","Type":"ContainerStarted","Data":"d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f"}
Mar 18 11:10:37 crc kubenswrapper[4733]: I0318 11:10:37.931607 4733 generic.go:334] "Generic (PLEG): container finished" podID="367d4602-00b0-4165-9bc2-8c95171f5d9c" containerID="d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f" exitCode=0
Mar 18 11:10:37 crc kubenswrapper[4733]: I0318 11:10:37.931671 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kd26" event={"ID":"367d4602-00b0-4165-9bc2-8c95171f5d9c","Type":"ContainerDied","Data":"d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f"}
Mar 18 11:10:38 crc kubenswrapper[4733]: I0318 11:10:38.942604 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kd26" event={"ID":"367d4602-00b0-4165-9bc2-8c95171f5d9c","Type":"ContainerStarted","Data":"05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5"}
Mar 18 11:10:38 crc kubenswrapper[4733]: I0318 11:10:38.967275 4733 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8kd26" podStartSLOduration=2.5709748660000002 podStartE2EDuration="5.967258303s" podCreationTimestamp="2026-03-18 11:10:33 +0000 UTC" firstStartedPulling="2026-03-18 11:10:34.899577936 +0000 UTC m=+3474.391312251" lastFinishedPulling="2026-03-18 11:10:38.295861323 +0000 UTC m=+3477.787595688" observedRunningTime="2026-03-18 11:10:38.959611897 +0000 UTC m=+3478.451346262" watchObservedRunningTime="2026-03-18 11:10:38.967258303 +0000 UTC m=+3478.458992628"
Mar 18 11:10:43 crc kubenswrapper[4733]: I0318 11:10:43.523111 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:43 crc kubenswrapper[4733]: I0318 11:10:43.524118 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:43 crc kubenswrapper[4733]: I0318 11:10:43.571765 4733 patch_prober.go:28] interesting pod/machine-config-daemon-2h7dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 11:10:43 crc kubenswrapper[4733]: I0318 11:10:43.571858 4733 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2h7dp" podUID="6f75e1c5-e0c5-43df-944f-77b734070793" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 11:10:44 crc kubenswrapper[4733]: I0318 11:10:44.568022 4733 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8kd26" podUID="367d4602-00b0-4165-9bc2-8c95171f5d9c" containerName="registry-server" probeResult="failure" output=<
Mar 18 11:10:44 crc kubenswrapper[4733]: 	timeout: failed to connect service ":50051" within 1s
Mar 18 11:10:44 crc kubenswrapper[4733]:  >
Mar 18 11:10:45 crc kubenswrapper[4733]: I0318 11:10:45.175468 4733 scope.go:117] "RemoveContainer" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830"
Mar 18 11:10:45 crc kubenswrapper[4733]: E0318 11:10:45.175827 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:10:46 crc kubenswrapper[4733]: I0318 11:10:46.176704 4733 scope.go:117] "RemoveContainer" containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97"
Mar 18 11:10:46 crc kubenswrapper[4733]: E0318 11:10:46.177672 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"
Mar 18 11:10:53 crc kubenswrapper[4733]: I0318 11:10:53.583520 4733 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:53 crc kubenswrapper[4733]: I0318 11:10:53.648308 4733 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:53 crc kubenswrapper[4733]: I0318 11:10:53.835269 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8kd26"]
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.078144 4733 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8kd26" podUID="367d4602-00b0-4165-9bc2-8c95171f5d9c" containerName="registry-server" containerID="cri-o://05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5" gracePeriod=2
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.521744 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.705624 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm8v9\" (UniqueName: \"kubernetes.io/projected/367d4602-00b0-4165-9bc2-8c95171f5d9c-kube-api-access-wm8v9\") pod \"367d4602-00b0-4165-9bc2-8c95171f5d9c\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") "
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.705673 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-catalog-content\") pod \"367d4602-00b0-4165-9bc2-8c95171f5d9c\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") "
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.705872 4733 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-utilities\") pod \"367d4602-00b0-4165-9bc2-8c95171f5d9c\" (UID: \"367d4602-00b0-4165-9bc2-8c95171f5d9c\") "
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.706916 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-utilities" (OuterVolumeSpecName: "utilities") pod "367d4602-00b0-4165-9bc2-8c95171f5d9c" (UID: "367d4602-00b0-4165-9bc2-8c95171f5d9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.707371 4733 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.720398 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367d4602-00b0-4165-9bc2-8c95171f5d9c-kube-api-access-wm8v9" (OuterVolumeSpecName: "kube-api-access-wm8v9") pod "367d4602-00b0-4165-9bc2-8c95171f5d9c" (UID: "367d4602-00b0-4165-9bc2-8c95171f5d9c"). InnerVolumeSpecName "kube-api-access-wm8v9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.809677 4733 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm8v9\" (UniqueName: \"kubernetes.io/projected/367d4602-00b0-4165-9bc2-8c95171f5d9c-kube-api-access-wm8v9\") on node \"crc\" DevicePath \"\""
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.849851 4733 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "367d4602-00b0-4165-9bc2-8c95171f5d9c" (UID: "367d4602-00b0-4165-9bc2-8c95171f5d9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 11:10:55 crc kubenswrapper[4733]: I0318 11:10:55.912035 4733 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367d4602-00b0-4165-9bc2-8c95171f5d9c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.087583 4733 generic.go:334] "Generic (PLEG): container finished" podID="367d4602-00b0-4165-9bc2-8c95171f5d9c" containerID="05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5" exitCode=0
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.087640 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kd26" event={"ID":"367d4602-00b0-4165-9bc2-8c95171f5d9c","Type":"ContainerDied","Data":"05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5"}
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.087666 4733 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kd26"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.087686 4733 scope.go:117] "RemoveContainer" containerID="05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.087673 4733 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kd26" event={"ID":"367d4602-00b0-4165-9bc2-8c95171f5d9c","Type":"ContainerDied","Data":"37730f69b5210919a437b0a5480e2a30678d89c33bbbcafdd873001ce13f8f7c"}
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.106404 4733 scope.go:117] "RemoveContainer" containerID="d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.125985 4733 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8kd26"]
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.133066 4733 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8kd26"]
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.156468 4733 scope.go:117] "RemoveContainer" containerID="bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.170392 4733 scope.go:117] "RemoveContainer" containerID="05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5"
Mar 18 11:10:56 crc kubenswrapper[4733]: E0318 11:10:56.170766 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5\": container with ID starting with 05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5 not found: ID does not exist" containerID="05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.170802 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5"} err="failed to get container status \"05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5\": rpc error: code = NotFound desc = could not find container \"05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5\": container with ID starting with 05c1dd2ccd8ca7aaae0115caacefd36edfb7dd4b1d2ca8878258ac43dd1972a5 not found: ID does not exist"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.170823 4733 scope.go:117] "RemoveContainer" containerID="d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f"
Mar 18 11:10:56 crc kubenswrapper[4733]: E0318 11:10:56.171087 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f\": container with ID starting with d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f not found: ID does not exist" containerID="d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.171152 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f"} err="failed to get container status \"d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f\": rpc error: code = NotFound desc = could not find container \"d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f\": container with ID starting with d9fd75094d39707df9dd53ed05d322a28e9c6abc8c019a61cf21173338374f1f not found: ID does not exist"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.171259 4733 scope.go:117] "RemoveContainer" containerID="bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20"
Mar 18 11:10:56 crc kubenswrapper[4733]: E0318 11:10:56.171546 4733 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20\": container with ID starting with bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20 not found: ID does not exist" containerID="bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20"
Mar 18 11:10:56 crc kubenswrapper[4733]: I0318 11:10:56.171571 4733 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20"} err="failed to get container status \"bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20\": rpc error: code = NotFound desc = could not find container \"bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20\": container with ID starting with bdf1a726e526ac3c6c0337af25f7f8af05476de01b143fd2326d85e30cc7ea20 not found: ID does not exist"
Mar 18 11:10:57 crc kubenswrapper[4733]: I0318 11:10:57.175124 4733 scope.go:117] "RemoveContainer" containerID="42cbdac80d1cd5197063da49b635b07d62e4adb2d6aae581ef46d16897659830"
Mar 18 11:10:57 crc kubenswrapper[4733]: E0318 11:10:57.175399 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(b4a4e3e2-bd4d-4f8d-97bc-51267378ab03)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b4a4e3e2-bd4d-4f8d-97bc-51267378ab03"
Mar 18 11:10:57 crc kubenswrapper[4733]: I0318 11:10:57.187126 4733 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367d4602-00b0-4165-9bc2-8c95171f5d9c" path="/var/lib/kubelet/pods/367d4602-00b0-4165-9bc2-8c95171f5d9c/volumes"
Mar 18 11:11:01 crc kubenswrapper[4733]: I0318 11:11:01.195525 4733 scope.go:117] "RemoveContainer" containerID="438887752c2380d0a118f44c0a43f524012895d063bcabcf9aab6778e9825f97"
Mar 18 11:11:01 crc kubenswrapper[4733]: E0318 11:11:01.197553 4733 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(f0570ce4-1455-4698-85cf-01f7108d9e7f)\"" pod="openstack/rabbitmq-server-0" podUID="f0570ce4-1455-4698-85cf-01f7108d9e7f"